[ 597.548117] env[68914]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 598.172398] env[68964]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 599.502817] env[68964]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=68964) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 599.503192] env[68964]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=68964) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 599.503279] env[68964]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=68964) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 599.503595] env[68964]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 599.703163] env[68964]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=68964) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 599.714315] env[68964]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=68964) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 599.847371] env[68964]: INFO nova.virt.driver [None req-06d3c3e0-8c28-424c-9153-48485d0117b5 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 599.921975] env[68964]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 599.922182] env[68964]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 599.922284] env[68964]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=68964) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 602.847789] env[68964]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-a2d84ec2-ae9a-43b7-b6d8-4bec348126a2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.864721] env[68964]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=68964) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 602.864863] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-cf18f778-ec4b-4905-a88a-45b65742b694 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.898015] env[68964]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 6f209.
[ 602.898172] env[68964]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.976s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 602.898700] env[68964]: INFO nova.virt.vmwareapi.driver [None req-06d3c3e0-8c28-424c-9153-48485d0117b5 None None] VMware vCenter version: 7.0.3
[ 602.902042] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1df1816-fc9e-4804-869b-f4308089084f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.923509] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a68c4c00-5a40-4064-9b3a-b862d947bc95 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.929493] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab786c56-4ec7-4fae-a227-eb203fa8273d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.936258] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caa21423-f0f8-4eb1-80c8-584767c2f101 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.949640] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f991a371-f75b-4c69-934c-1c0d3bcd7eec {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.955530] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1fd4f2d-a77d-4b68-81ab-7bde6539afe4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.986333] env[68964]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-9ea80168-8b10-4ffa-8764-d881ae210d56 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.991896] env[68964]: DEBUG nova.virt.vmwareapi.driver [None req-06d3c3e0-8c28-424c-9153-48485d0117b5 None None] Extension org.openstack.compute already exists. {{(pid=68964) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 602.994528] env[68964]: INFO nova.compute.provider_config [None req-06d3c3e0-8c28-424c-9153-48485d0117b5 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 603.013043] env[68964]: DEBUG nova.context [None req-06d3c3e0-8c28-424c-9153-48485d0117b5 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),66d8db0c-b367-47d9-a5ac-0eb6fb83feb6(cell1) {{(pid=68964) load_cells /opt/stack/nova/nova/context.py:464}}
[ 603.014980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 603.015315] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 603.016026] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 603.016485] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Acquiring lock "66d8db0c-b367-47d9-a5ac-0eb6fb83feb6" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 603.016678] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Lock "66d8db0c-b367-47d9-a5ac-0eb6fb83feb6" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 603.017673] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Lock "66d8db0c-b367-47d9-a5ac-0eb6fb83feb6" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 603.042341] env[68964]: INFO dbcounter [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Registered counter for database nova_cell0
[ 603.050747] env[68964]: INFO dbcounter [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Registered counter for database nova_cell1
[ 603.053712] env[68964]: DEBUG oslo_db.sqlalchemy.engines [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68964) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 603.054077] env[68964]: DEBUG oslo_db.sqlalchemy.engines [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68964) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 603.058561] env[68964]: DEBUG dbcounter [-] [68964] Writer thread running {{(pid=68964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 603.059592] env[68964]: DEBUG dbcounter [-] [68964] Writer thread running {{(pid=68964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 603.061375] env[68964]: ERROR nova.db.main.api [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 603.061375] env[68964]: result = function(*args, **kwargs)
[ 603.061375] env[68964]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 603.061375] env[68964]: return func(*args, **kwargs)
[ 603.061375] env[68964]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 603.061375] env[68964]: result = fn(*args, **kwargs)
[ 603.061375] env[68964]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 603.061375] env[68964]: return f(*args, **kwargs)
[ 603.061375] env[68964]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 603.061375] env[68964]: return db.service_get_minimum_version(context, binaries)
[ 603.061375] env[68964]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 603.061375] env[68964]: _check_db_access()
[ 603.061375] env[68964]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 603.061375] env[68964]: stacktrace = ''.join(traceback.format_stack())
[ 603.061375] env[68964]:
[ 603.062417] env[68964]: ERROR nova.db.main.api [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 603.062417] env[68964]: result = function(*args, **kwargs)
[ 603.062417] env[68964]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 603.062417] env[68964]: return func(*args, **kwargs)
[ 603.062417] env[68964]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 603.062417] env[68964]: result = fn(*args, **kwargs)
[ 603.062417] env[68964]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 603.062417] env[68964]: return f(*args, **kwargs)
[ 603.062417] env[68964]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 603.062417] env[68964]: return db.service_get_minimum_version(context, binaries)
[ 603.062417] env[68964]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 603.062417] env[68964]: _check_db_access()
[ 603.062417] env[68964]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 603.062417] env[68964]: stacktrace = ''.join(traceback.format_stack())
[ 603.062417] env[68964]:
[ 603.062753] env[68964]: WARNING nova.objects.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 603.062912] env[68964]: WARNING nova.objects.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Failed to get minimum service version for cell 66d8db0c-b367-47d9-a5ac-0eb6fb83feb6
[ 603.063371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Acquiring lock "singleton_lock" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 603.063529] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Acquired lock "singleton_lock" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 603.063771] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Releasing lock "singleton_lock" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 603.064106] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Full set of CONF: {{(pid=68964) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 603.064251] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ******************************************************************************** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 603.064378] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] Configuration options gathered from: {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 603.064512] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 603.064696] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 603.064821] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ================================================================================ {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 603.065037] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] allow_resize_to_same_host = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.065241] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] arq_binding_timeout = 300 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.065380] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] backdoor_port = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.065507] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] backdoor_socket = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.065671] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] block_device_allocate_retries = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.065837] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] block_device_allocate_retries_interval = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.066012] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cert = self.pem {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.066215] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.066403] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute_monitors = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.066573] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] config_dir = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.066742] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] config_drive_format = iso9660 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.066873] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.067045] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] config_source = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.067218] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] console_host = devstack {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.067385] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] control_exchange = nova {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.067544] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cpu_allocation_ratio = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.067703] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] daemon = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.067869] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] debug = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.068037] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] default_access_ip_network_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.068208] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] default_availability_zone = nova {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.068364] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] default_ephemeral_format = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.068522] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] default_green_pool_size = 1000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.068759] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.068921] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] default_schedule_zone = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.069089] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] disk_allocation_ratio = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.069252] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] enable_new_services = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.069430] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] enabled_apis = ['osapi_compute'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.069591] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] enabled_ssl_apis = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.069747] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] flat_injected = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.069900] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] force_config_drive = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.070065] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] force_raw_images = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.070238] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] graceful_shutdown_timeout = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.070397] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] heal_instance_info_cache_interval = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.070609] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] host = cpu-1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.070778] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.070935] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] initial_disk_allocation_ratio = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.071103] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] initial_ram_allocation_ratio = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.071324] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.071484] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_build_timeout = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.071641] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_delete_interval = 300 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.071803] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_format = [instance: %(uuid)s] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.071963] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_name_template = instance-%08x {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.072136] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_usage_audit = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.072307] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_usage_audit_period = month {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.072467] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.072627] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] instances_path = /opt/stack/data/nova/instances {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.072790] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] internal_service_availability_zone = internal {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.072942] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] key = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.073124] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] live_migration_retry_count = 30 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.073312] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_config_append = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.073482] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.073644] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_dir = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.073807] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.073934] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_options = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.074108] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_rotate_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.074279] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_rotate_interval_type = days {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.074445] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] log_rotation_type = none {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.074574] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.074699] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.074864] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075036] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075192] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075365] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] long_rpc_timeout = 1800 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075525] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] max_concurrent_builds = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075682] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] max_concurrent_live_migrations = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075839] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] max_concurrent_snapshots = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.075995] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] max_local_block_devices = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.076189] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] max_logfile_count = 30 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.076356] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] max_logfile_size_mb = 200 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.076514] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] maximum_instance_delete_attempts = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.076680] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metadata_listen = 0.0.0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.076846] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metadata_listen_port = 8775 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.077023] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metadata_workers = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.077193] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] migrate_max_retries = -1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.077363] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] mkisofs_cmd = genisoimage {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.077570] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] my_block_storage_ip = 10.180.1.21 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.077701] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] my_ip = 10.180.1.21 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.077888] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] network_allocate_retries = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.078054] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.078228] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] osapi_compute_listen = 0.0.0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.078392] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] osapi_compute_listen_port = 8774 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.078557] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] osapi_compute_unique_server_name_scope = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.078723] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] osapi_compute_workers = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.078883] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] password_length = 12 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079053] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] periodic_enable = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079215] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] periodic_fuzzy_delay = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079385] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] pointer_model = usbtablet {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079550] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] preallocate_images = none {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079708] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] publish_errors = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079836] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] pybasedir = /opt/stack/nova {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.079990] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ram_allocation_ratio = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.080163] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rate_limit_burst = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.080328] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rate_limit_except_level = CRITICAL {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.080487] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rate_limit_interval = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.080643] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reboot_timeout = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.080798] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reclaim_instance_interval = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.080951] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] record = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.081130] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reimage_timeout_per_gb = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.081298] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] report_interval = 120 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.081455] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rescue_timeout = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.081612] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reserved_host_cpus = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.081767] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reserved_host_disk_mb = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.081921] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reserved_host_memory_mb = 512 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.082090] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] reserved_huge_pages = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.082252] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] resize_confirm_window = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.082408] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] resize_fs_using_block_device = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.082566] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] resume_guests_state_on_host_boot = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.082731] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.082890] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rpc_response_timeout = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.083057] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] run_external_periodic_tasks = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.083255] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] running_deleted_instance_action = reap {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.083425] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] running_deleted_instance_poll_interval = 1800 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.083584] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] running_deleted_instance_timeout = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.083740] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler_instance_sync_interval = 120 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.083908] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_down_time = 720 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.084089] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] servicegroup_driver = db {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.084255] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] shelved_offload_time = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.084415] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] shelved_poll_interval = 3600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.084580] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] shutdown_timeout = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.084744] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] source_is_ipv6 = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.084903] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ssl_only = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.085194] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.085370] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] sync_power_state_interval = 600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.085533] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] sync_power_state_pool_size = 1000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.085704] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] syslog_log_facility = LOG_USER {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.085862] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] tempdir = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.086026] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] timeout_nbd = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.086220] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] transport_url = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.086390] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] update_resources_interval = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.086549] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_cow_images = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.086707] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_eventlog = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.086865] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_journal = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087031] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_json = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087194] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_rootwrap_daemon = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087352] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_stderr = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087507] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] use_syslog = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087661] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vcpu_pin_set = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087825] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plugging_is_fatal = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.087989] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plugging_timeout = 300 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.088168] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] virt_mkfs = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.088328] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] volume_usage_poll_interval = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.088485] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] watch_log_file = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.088653] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] web = /usr/share/spice-html5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 603.088840] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_concurrency.disable_process_locking = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.089145] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.089328] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.089494] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.089664] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.089832] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.089993] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.090189] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.auth_strategy = keystone {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.090360] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.compute_link_prefix = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.090534] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.090709] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.dhcp_domain = novalocal {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.090877] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.enable_instance_password = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.091050] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.glance_link_prefix = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.091219] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.091393] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.091555] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.instance_list_per_project_cells = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.091716] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.list_records_by_skipping_down_cells = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.091879] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.local_metadata_per_cell = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.092059] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.max_limit = 1000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.092238] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.metadata_cache_expiration = 15 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.092412] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.neutron_default_tenant_id = default {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.092581] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.use_forwarded_for = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.092745] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.use_neutron_default_nets = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.092912] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.093097] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.093294] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.093474] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.093649] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_dynamic_targets = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.093814] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_jsonfile_path = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.093993] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
api.vendordata_providers = ['StaticJSON'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.094202] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.backend = dogpile.cache.memcached {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.094372] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.backend_argument = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.094542] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.config_prefix = cache.oslo {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.094713] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.dead_timeout = 60.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.094878] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.debug_cache_backend = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.095051] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.enable_retry_client = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.095250] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.enable_socket_keepalive = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.095436] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.enabled = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.095602] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.expiration_time = 600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.095765] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.hashclient_retry_attempts = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.095929] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.hashclient_retry_delay = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.096116] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_dead_retry = 300 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.096307] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_password = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.096474] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.096637] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.096799] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_pool_maxsize = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.096961] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.097137] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_sasl_enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.097322] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.097489] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_socket_timeout = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.097656] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.memcache_username = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.097821] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.proxies = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.097985] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.retry_attempts = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.098164] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.retry_delay = 0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.098330] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.socket_keepalive_count = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.098492] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.socket_keepalive_idle = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.098653] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.socket_keepalive_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.098811] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.tls_allowed_ciphers = None {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.098967] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.tls_cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.099136] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.tls_certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.099296] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.tls_enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.099452] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cache.tls_keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.099620] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.099794] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.auth_type = password {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.099954] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.100141] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.catalog_info = volumev3::publicURL {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.100304] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.100465] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.100627] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.cross_az_attach = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.100788] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.debug = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.100947] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.endpoint_template = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.101123] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.http_retries = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.101290] env[68964]: DEBUG oslo_service.service [None 
req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.101448] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.101619] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.os_region_name = RegionOne {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.101780] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.101939] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cinder.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.102369] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.102369] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.cpu_dedicated_set = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.102478] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.cpu_shared_set = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.102584] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.image_type_exclude_list = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.102749] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.102910] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.max_concurrent_disk_ops = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.103085] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.max_disk_devices_to_attach = -1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.103282] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.103458] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.103625] env[68964]: DEBUG oslo_service.service 
[None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.resource_provider_association_refresh = 300 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.103786] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.shutdown_retry_interval = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.103966] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.104160] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] conductor.workers = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.104341] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] console.allowed_origins = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.104501] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] console.ssl_ciphers = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.104672] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] console.ssl_minimum_version = default {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.104844] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] consoleauth.token_ttl = 600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.105025] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.105221] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.105391] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.105553] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.105712] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.105871] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.106043] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] 
cyborg.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.106232] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.106403] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.106564] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.106723] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.region_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.106881] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.107063] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.service_type = accelerator {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.107231] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.107392] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.107550] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.107705] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.107884] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.108057] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] cyborg.version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.108247] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.backend = sqlalchemy {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.108428] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.connection = **** {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.108601] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.connection_debug = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.108774] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.connection_parameters = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.108938] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.connection_recycle_time = 3600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.109120] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.connection_trace = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.109285] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.db_inc_retry_interval = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.109451] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.db_max_retries = 20 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.109612] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.db_max_retry_interval = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.109774] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.db_retry_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.109943] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.max_overflow = 50 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.110122] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.max_pool_size = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.110343] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.max_retries = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.110527] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.110690] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.mysql_wsrep_sync_wait = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.110851] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.pool_timeout = None {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.111033] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.retry_interval = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.111777] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.slave_connection = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.111777] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.sqlite_synchronous = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.111777] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] database.use_db_reconnect = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.111777] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.backend = sqlalchemy {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112026] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.connection = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112026] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.connection_debug = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112194] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.connection_parameters = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112364] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.connection_recycle_time = 3600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112534] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.connection_trace = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112698] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.db_inc_retry_interval = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.112860] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.db_max_retries = 20 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.113037] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.db_max_retry_interval = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.113232] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.db_retry_interval = 1 {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.113416] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.max_overflow = 50 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.113639] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.max_pool_size = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.113832] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.max_retries = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.114014] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.114185] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.114354] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.pool_timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.114525] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.retry_interval = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.114686] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.slave_connection = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.114852] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] api_database.sqlite_synchronous = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.115035] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] devices.enabled_mdev_types = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.115244] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.115419] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ephemeral_storage_encryption.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.115592] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.115766] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.api_servers = None {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.115932] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.116120] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.116312] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.116478] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.116639] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.116800] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.debug = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.116966] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.default_trusted_certificate_ids = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.117146] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.enable_certificate_validation = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.117311] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.enable_rbd_download = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.117471] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.117631] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.117792] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.117949] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.118117] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.118283] env[68964]: DEBUG 
oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.num_retries = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.118451] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.rbd_ceph_conf = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.118610] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.rbd_connect_timeout = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.118778] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.rbd_pool = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.118942] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.rbd_user = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.119110] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.region_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.119271] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.119439] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.service_type = image {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.119600] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.119758] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.119916] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.120083] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.120263] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.120429] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.verify_glance_signatures = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.120588] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] glance.version = None {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.120753] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] guestfs.debug = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.120922] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.config_drive_cdrom = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.121097] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.config_drive_inject_password = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.121265] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.121430] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.enable_instance_metrics_collection = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.121594] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.enable_remotefx = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.121763] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.instances_path_share = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.121927] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.iscsi_initiator_list = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.122100] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.limit_cpu_features = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.122268] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.122432] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.122593] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.power_state_check_timeframe = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.122762] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.122932] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.123112] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.use_multipath_io = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.123302] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.volume_attach_retry_count = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.123468] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.123628] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.vswitch_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.123791] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.123960] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] mks.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.124333] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.124526] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] image_cache.manager_interval = 2400 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.124696] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] image_cache.precache_concurrency = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.124868] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] image_cache.remove_unused_base_images = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.125048] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.125258] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.125448] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] image_cache.subdirectory_name = _base {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.125625] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.api_max_retries 
= 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.125790] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.api_retry_interval = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.125948] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.126141] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.auth_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.126321] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.126478] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.126642] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.126805] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.conductor_group = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.126964] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.127136] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.127297] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.127461] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.127618] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.127778] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.127934] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.128108] env[68964]: DEBUG 
oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.peer_list = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.128271] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.region_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.128432] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.serial_console_state_timeout = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.128589] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.128758] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.service_type = baremetal {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.128920] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.129086] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.129246] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.129405] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.129585] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.129746] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ironic.version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.129929] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.130115] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] key_manager.fixed_key = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.130305] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.130467] env[68964]: DEBUG oslo_service.service [None 
req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.barbican_api_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.130625] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.barbican_endpoint = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.130793] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.barbican_endpoint_type = public {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.130950] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.barbican_region_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.131121] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.131282] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.131443] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.131601] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.131757] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.131920] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.number_of_retries = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.132090] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.retry_delay = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.132258] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.send_service_user_token = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.132423] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.132582] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.132743] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.verify_ssl = True {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.132900] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican.verify_ssl_path = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.133089] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.133283] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.auth_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.133450] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.133618] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.133781] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.133943] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.134114] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.134280] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.134438] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] barbican_service_user.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.134602] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.approle_role_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.134760] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.approle_secret_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.134919] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.135087] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.certfile = None {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.135283] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.135450] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.135609] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.135780] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.kv_mountpoint = secret {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.135937] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.kv_path = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.136142] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.kv_version = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.136364] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.namespace = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.136538] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.root_token_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.136706] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.136870] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.ssl_ca_crt_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.137042] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.137212] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.use_ssl = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.137387] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.137557] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.137720] env[68964]: DEBUG oslo_service.service [None 
req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.auth_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.137881] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.138058] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.138229] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.138391] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.138551] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.138709] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.138871] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.139040] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.139212] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.139369] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.139525] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.region_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.139680] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.139848] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.service_type = identity {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.140016] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.split_loggers = False {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.140181] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.140341] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.140497] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.140674] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.140833] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] keystone.version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.141043] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.connection_uri = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.141209] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_mode = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.141381] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_model_extra_flags = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.141548] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_models = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.141717] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_power_governor_high = performance {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.141885] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_power_governor_low = powersave {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.142059] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_power_management = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.142236] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.142401] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.device_detach_attempts = 8 {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.142563] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.device_detach_timeout = 20 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.142727] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.disk_cachemodes = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.142884] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.disk_prefix = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.143064] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.enabled_perf_events = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.143268] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.file_backed_memory = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.143441] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.gid_maps = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.143603] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.hw_disk_discard = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.143761] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.hw_machine_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.143929] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_rbd_ceph_conf = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.144104] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.144278] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.144445] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_rbd_glance_store_name = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.144611] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_rbd_pool = rbd {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.144779] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_type = default {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.144937] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.images_volume_group = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.145140] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.inject_key = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.145316] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.inject_partition = -2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.145484] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.inject_password = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.145647] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.iscsi_iface = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.145809] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.iser_use_multipath = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.145972] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_bandwidth = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.146179] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.146351] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_downtime = 500 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.146513] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.146679] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.146841] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_inbound_addr = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147013] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147181] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_permit_post_copy = False {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147342] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_scheme = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147515] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_timeout_action = abort {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147677] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_tunnelled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147836] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_uri = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.147996] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.live_migration_with_native_tls = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.148175] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.max_queues = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.148333] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.148490] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.nfs_mount_options = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.148802] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.148976] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.149156] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.num_iser_scan_tries = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.149320] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.num_memory_encrypted_guests = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.149484] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.149646] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.num_pcie_ports = 0 
{{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.149811] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.num_volume_scan_tries = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.149975] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.pmem_namespaces = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.150147] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.quobyte_client_cfg = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.150429] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.150602] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rbd_connect_timeout = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.150766] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.150934] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.151111] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rbd_secret_uuid = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.151274] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rbd_user = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.151439] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.151610] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.remote_filesystem_transport = ssh {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.151770] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rescue_image_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.151928] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rescue_kernel_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.152096] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rescue_ramdisk_id = None {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.152269] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.152433] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.rx_queue_size = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.152596] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.smbfs_mount_options = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.152868] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.153057] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.snapshot_compression = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.153248] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.snapshot_image_format = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.153478] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.153647] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.sparse_logical_volumes = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.153808] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.swtpm_enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.153975] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.swtpm_group = tss {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.154159] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.swtpm_user = tss {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.154329] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.sysinfo_serial = unique {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.154486] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.tb_cache_size = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.154645] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.tx_queue_size = None {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.154806] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.uid_maps = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.154970] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.use_virtio_for_bridges = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.155178] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.virt_type = kvm {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.155360] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.volume_clear = zero {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.155524] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.volume_clear_size = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.155692] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.volume_use_multipath = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.155852] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_cache_path = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.156029] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.156224] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_mount_group = qemu {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.156396] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_mount_opts = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.156565] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.156840] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.157028] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.vzstorage_mount_user = stack {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.157201] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=68964) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.157377] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.157553] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.auth_type = password {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.157713] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.157871] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.158045] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.158208] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.158366] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.158536] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.default_floating_pool = public {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.158693] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.158857] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.extension_sync_interval = 600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159022] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.http_retries = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159186] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159348] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159504] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159672] env[68964]: 
DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159829] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.159995] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.ovs_bridge = br-int {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.160172] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.physnets = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.160342] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.region_name = RegionOne {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.160508] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.service_metadata_proxy = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.160667] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.160835] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.service_type = network {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.160999] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.161172] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.161332] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.161492] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.161671] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.161832] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] neutron.version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.162010] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None 
None] notifications.bdms_in_notifications = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.162195] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] notifications.default_level = INFO {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.162370] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] notifications.notification_format = unversioned {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.162534] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] notifications.notify_on_state_change = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.162709] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.162881] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] pci.alias = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.163057] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] pci.device_spec = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.163252] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] pci.report_in_placement = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.163432] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.163605] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.auth_type = password {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.163773] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.163929] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.164099] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.164265] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.164422] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] 
placement.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.164579] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.164736] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.default_domain_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.164890] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.default_domain_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.165058] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.domain_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.165261] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.domain_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.165428] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.165591] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.165748] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.165904] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.166069] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.166276] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.password = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.166451] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.project_domain_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.166620] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.project_domain_name = Default {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.166786] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.project_id = None {{(pid=68964) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.166956] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.project_name = service {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.167140] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.region_name = RegionOne {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.167303] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.167470] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.service_type = placement {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.167630] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.167787] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.167943] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.168114] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.system_scope = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.168281] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.168435] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.trust_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.168592] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.user_domain_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.168759] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.user_domain_name = Default {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.168916] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.user_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.169098] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.username = placement {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
603.169284] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.169444] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] placement.version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.169620] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.cores = 20 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.169784] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.count_usage_from_placement = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.169955] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.170136] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.injected_file_content_bytes = 10240 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.170305] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.injected_file_path_length = 255 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.170469] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.injected_files = 5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.170883] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.instances = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.170883] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.key_pairs = 100 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.170972] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.metadata_items = 128 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.171116] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.ram = 51200 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.171281] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.recheck_quota = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.171447] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] quota.server_group_members = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.171614] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None 
None] quota.server_groups = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.171781] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rdp.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.172103] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.172291] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.172458] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.172620] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.image_metadata_prefilter = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.172782] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.172946] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.max_attempts = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.173147] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.max_placement_results = 1000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.173330] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.173497] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.query_placement_for_image_type_support = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.173658] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.173832] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] scheduler.workers = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.174010] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
603.174190] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.174370] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.174537] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.174701] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.174863] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.175032] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.175253] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.175433] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.host_subset_size = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.175602] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.175764] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.175927] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.176121] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.isolated_hosts = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.176317] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.isolated_images = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.176488] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.176653] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.176816] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.176978] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.pci_in_placement = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.177156] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.177320] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.177486] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.177646] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.177808] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.177968] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.178143] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.track_instance_changes = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.178321] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.178491] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metrics.required = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
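Each record in this dump is produced by oslo.config's ConfigOpts.log_opt_values() (the cfg.py:2609 frame cited on every line), which walks every registered option group at service startup and logs one DEBUG line per option, masking secret options (for example vmware.host_password and the transport URLs below) as ****. A minimal, self-contained sketch of the same mechanism; the filter_scheduler.host_subset_size option and its value are taken from the dump, while the standalone ConfigOpts object is purely illustrative:

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    # Re-register one of the options seen in the dump above.
    CONF.register_opts([cfg.IntOpt('host_subset_size', default=1)],
                       group=cfg.OptGroup('filter_scheduler'))
    CONF([])  # parse an empty argv so defaults take effect

    # Emits "filter_scheduler.host_subset_size = 1" at DEBUG level,
    # the same way the records in this log were produced.
    CONF.log_opt_values(LOG, logging.DEBUG)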
[ 603.178654] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metrics.weight_multiplier = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.178815] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.178978] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] metrics.weight_setting = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.179288] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.179466] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] serial_console.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.179638] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] serial_console.port_range = 10000:20000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.179808] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.179973] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.180154] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] serial_console.serialproxy_port = 6083 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.180356] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.180586] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.auth_type = password {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.180802] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.181135] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.181365] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.181531] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.181690] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.181885] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.send_service_user_token = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.182060] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.182222] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] service_user.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.182393] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.agent_enabled = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.182553] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.182843] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.183053] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.183237] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.html5proxy_port = 6082 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.183403] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.image_compression = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.183563] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.jpeg_compression = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.183722] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.playback_compression = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.183890] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.server_listen = 127.0.0.1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.184071] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.184238] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.streaming_mode = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.184399] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] spice.zlib_compression = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.184572] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] upgrade_levels.baseapi = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.184733] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] upgrade_levels.cert = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.184903] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] upgrade_levels.compute = auto {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.185076] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] upgrade_levels.conductor = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.185242] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] upgrade_levels.scheduler = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.185411] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.185575] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.auth_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.185732] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.185890] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.186063] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.186256] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.186424] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.186589] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.186749] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vendordata_dynamic_auth.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.186935] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.api_retry_count = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.187109] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.ca_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.187292] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.cache_prefix = devstack-image-cache {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.187460] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.cluster_name = testcl1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.187622] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.connection_pool_size = 10 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.187778] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.console_delay_seconds = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.187945] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.datastore_regex = ^datastore.* {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.188174] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.188353] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.host_password = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.188519] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.host_port = 443 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.188686] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.host_username = administrator@vsphere.local {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.188854] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.insecure = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.189025] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.integration_bridge = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.189196] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.maximum_objects = 100 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.189358] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.pbm_default_policy = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.189518] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.pbm_enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.189685] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.pbm_wsdl_location = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.189853] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.190028] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.serial_port_proxy_uri = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.190192] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.serial_port_service_uri = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.190361] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.task_poll_interval = 0.5 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.190535] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.use_linked_clone = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.190703] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.vnc_keymap = en-us {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.190869] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.vnc_port = 5900 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.191041] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vmware.vnc_port_total = 10000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
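The [vmware] values above are what the VMwareVCDriver hands to oslo.vmware when opening its vCenter session. A sketch under the assumption that oslo_vmware.api.VMwareAPISession accepts these keyword parameters; the password is shown masked exactly as the log masks it:

    from oslo_vmware import api

    session = api.VMwareAPISession(
        host='vc1.osci.c.eu-de-1.cloud.sap',            # vmware.host_ip
        port=443,                                       # vmware.host_port
        server_username='administrator@vsphere.local',  # vmware.host_username
        server_password='****',                         # vmware.host_password (masked in the log)
        api_retry_count=10,                             # vmware.api_retry_count
        task_poll_interval=0.5,                         # vmware.task_poll_interval
        insecure=True)                                  # vmware.insecure: skip cert verification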
[ 603.191237] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.auth_schemes = ['none'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.191413] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.191696] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.191878] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.192057] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.novncproxy_port = 6080 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.192238] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.server_listen = 127.0.0.1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.192411] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.192572] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.vencrypt_ca_certs = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.192731] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.vencrypt_client_cert = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.192887] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vnc.vencrypt_client_key = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.193069] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.193237] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.disable_deep_image_inspection = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.193400] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.193559] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.193719] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.193882] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.disable_rootwrap = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.194055] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.enable_numa_live_migration = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.194224] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.194391] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.194554] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.194714] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.libvirt_disable_apic = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.194874] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.195046] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.195239] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.195410] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.195571] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.195734] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.195894] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.196066] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.196274] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.196455] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.196655] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.196825] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.client_socket_timeout = 900 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.196990] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.default_pool_size = 1000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.197171] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.keep_alive = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.197338] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.max_header_line = 16384 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.197499] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.secure_proxy_ssl_header = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.197657] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.ssl_ca_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.197813] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.ssl_cert_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.197971] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.ssl_key_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.198149] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.tcp_keepidle = 600 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.198324] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.198487] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] zvm.ca_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.198647] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] zvm.cloud_connector_url = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.198929] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.199114] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] zvm.reachable_timeout = 300 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.199308] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.enforce_new_defaults = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.199480] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.enforce_scope = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.199656] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.policy_default_rule = default {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.199844] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.200024] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.policy_file = policy.yaml {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.200201] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.200363] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.200519] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.200675] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.200834] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
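The [oslo_policy] block above shows this deployment running with the secure-RBAC settings (enforce_new_defaults = True, enforce_scope = True), under which requests whose token scope does not match a rule fail outright rather than only emitting a deprecation warning. A minimal sketch of how a service builds its policy enforcer from these options:

    from oslo_config import cfg
    from oslo_policy import policy

    CONF = cfg.CONF
    # Reads policy_file = policy.yaml plus any overrides found under
    # policy_dirs = ['policy.d'], as logged above.
    enforcer = policy.Enforcer(CONF)
    enforcer.load_rules()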
[ 603.201006] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.201188] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.201366] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.connection_string = messaging:// {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.201532] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.enabled = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.201699] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.es_doc_type = notification {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.201860] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.es_scroll_size = 10000 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.202034] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.es_scroll_time = 2m {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.202202] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.filter_error_trace = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.202375] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.hmac_keys = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.202542] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.sentinel_service_name = mymaster {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.202710] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.socket_timeout = 0.1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.202871] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.trace_requests = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.203042] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler.trace_sqlalchemy = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.203238] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler_jaeger.process_tags = {} {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.203402] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler_jaeger.service_name_prefix = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.203565] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] profiler_otlp.service_name_prefix = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.203742] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] remote_debug.host = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.203910] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] remote_debug.port = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.204103] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.204273] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.204437] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.204596] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.204758] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.204918] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.205086] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.205286] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.205455] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.205614] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.205781] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.205946] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.206154] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.206360] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.206534] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.206709] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.206871] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.207043] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.207254] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.207390] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.207580] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.207755] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.207915] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.208089] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.208261] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.208435] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.ssl = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.208606] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.208779] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.208941] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.209122] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.209294] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_rabbit.ssl_version = {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.209481] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.209650] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_notifications.retry = -1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.209832] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.210021] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_messaging_notifications.transport_url = **** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
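oslo_messaging_notifications.transport_url is masked as **** above because the URL embeds the broker credentials; driver = ['messagingv2'] and topics = ['notifications'] say where versioned notifications are emitted. A sketch of how a service turns these options into a notifier (the publisher_id value here is illustrative, not from the log):

    import oslo_messaging
    from oslo_config import cfg

    CONF = cfg.CONF
    # Resolves the (masked) transport_url for notifications.
    transport = oslo_messaging.get_notification_transport(CONF)
    notifier = oslo_messaging.Notifier(
        transport,
        driver='messagingv2',                 # as logged above
        topics=['notifications'],             # as logged above
        publisher_id='nova-compute.devstack')  # illustrative value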
[ 603.210198] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.auth_section = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.210363] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.auth_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.210521] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.cafile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.210696] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.certfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.210860] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.collect_timing = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211027] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.connect_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211195] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.connect_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211352] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.endpoint_id = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211506] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.endpoint_override = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211663] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.insecure = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211819] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.keyfile = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.211978] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.max_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.212213] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.min_version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.212390] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.region_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.212552] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.service_name = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.212753] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.service_type = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.212875] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.split_loggers = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.213032] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.status_code_retries = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.213197] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.status_code_retry_delay = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.213357] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.timeout = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.213516] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.valid_interfaces = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.213672] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_limit.version = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.213836] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_reports.file_event_handler = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.214007] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.214185] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] oslo_reports.log_dir = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.214360] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.214522] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.214681] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.214850] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.215019] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.215215] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.215398] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.215562] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_ovs_privileged.group = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.215722] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.215890] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.216065] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 603.216255] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] vif_plug_ovs_privileged.user = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
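The capabilities lists for the two vif_plug_* privsep contexts above ([12] and [12, 1], with [21] for privsep_osbrick further below) are Linux capability numbers that each oslo.privsep daemon retains after dropping the rest. A small decoding sketch, using the numbers from linux/capability.h:

    # Capability numbers per linux/capability.h.
    CAPS = {1: 'CAP_DAC_OVERRIDE', 12: 'CAP_NET_ADMIN', 21: 'CAP_SYS_ADMIN'}
    contexts = {
        'vif_plug_linux_bridge_privileged': [12],
        'vif_plug_ovs_privileged': [12, 1],
        'privsep_osbrick': [21],
    }
    for name, nums in contexts.items():
        print(f"{name}: {[CAPS[n] for n in nums]}")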
env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.217522] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.217700] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.217882] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.218064] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.isolate_vif = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.218239] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.218409] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.218580] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.218750] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.ovsdb_interface = native {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.218914] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_vif_ovs.per_port_bridge = False {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.219089] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_brick.lock_path = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.219258] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.219423] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.219594] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] privsep_osbrick.capabilities = [21] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.219753] 
env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] privsep_osbrick.group = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.219909] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] privsep_osbrick.helper_command = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.220084] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.220253] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.220413] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] privsep_osbrick.user = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.220585] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.220742] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] nova_sys_admin.group = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.220897] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] nova_sys_admin.helper_command = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.221072] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.221238] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.221397] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] nova_sys_admin.user = None {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 603.221526] env[68964]: DEBUG oslo_service.service [None req-ee0f1341-ed24-4b42-8140-3621aee66998 None None] ******************************************************************************** {{(pid=68964) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 603.222007] env[68964]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 603.233512] env[68964]: WARNING nova.virt.vmwareapi.driver [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
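The run of "group.option = value" lines closed off by the asterisk row above is nova-compute's startup configuration dump. Once the config files are parsed, oslo.config can walk every registered option group and log the effective values via log_opt_values(); a minimal sketch of that mechanism follows (the vif_plug_ovs_privileged options are re-registered by hand here purely for illustration; in the real service os-vif registers them):

    import logging

    from oslo_config import cfg

    CONF = cfg.CONF
    LOG = logging.getLogger(__name__)

    # One group from the dump above, re-registered for the demo.
    CONF.register_opts(
        [cfg.IntOpt('thread_pool_size', default=8),
         cfg.StrOpt('user')],
        group='vif_plug_ovs_privileged')

    logging.basicConfig(level=logging.DEBUG)
    CONF([], project='demo')  # parse an empty command line
    # Emits "vif_plug_ovs_privileged.thread_pool_size = 8" and so on,
    # bracketed by the same "****" separator rows seen in the log.
    CONF.log_opt_values(LOG, logging.DEBUG)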
[ 603.222007] env[68964]: INFO nova.service [-] Starting compute node (version 0.0.1)
[ 603.233512] env[68964]: WARNING nova.virt.vmwareapi.driver [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list.
[ 603.233944] env[68964]: INFO nova.virt.node [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Generated node identity 63b0294e-f555-48a6-a542-3466427066a9
[ 603.234188] env[68964]: INFO nova.virt.node [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Wrote node identity 63b0294e-f555-48a6-a542-3466427066a9 to /opt/stack/data/n-cpu-1/compute_id
[ 603.247340] env[68964]: WARNING nova.compute.manager [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Compute nodes ['63b0294e-f555-48a6-a542-3466427066a9'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
[ 603.280380] env[68964]: INFO nova.compute.manager [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
[ 603.305375] env[68964]: WARNING nova.compute.manager [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found.
[ 603.305666] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 603.305902] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 603.306073] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 603.306237] env[68964]: DEBUG nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 603.307358] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b4b7635-e8dd-4898-92f1-7fa1ced31617 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.316118] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b588c6-b191-4caf-99cd-c2d486786281 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.329879] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cd876bc-ce87-44b0-ba37-689ddc6033f6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.335853] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bea99a-4d68-4336-b8d1-f8aa0d2e7cf7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
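Each "Acquiring lock ... / Lock ... acquired / Lock ... released" triple in this log is oslo.concurrency's lockutils wrapper tracing a critical section, here around the resource tracker's compute_resources work. A hedged sketch of the two usual forms (the lock name matches the log; the function bodies are placeholders, not Nova's code):

    from oslo_concurrency import lockutils

    # Decorator form: serialize every caller on one named lock.
    @lockutils.synchronized('compute_resources')
    def clean_compute_node_cache():
        pass  # placeholder body

    # Context-manager form, for a narrower section.
    def update_available_resource():
        with lockutils.lock('compute_resources'):
            pass  # placeholder body

The "waited 0.000s / held 0.000s" figures are logged by lockutils itself, which is why the code paths in the double braces all point at lockutils.py.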
[ 603.365615] env[68964]: DEBUG nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180974MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 603.365796] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 603.365956] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 603.379390] env[68964]: WARNING nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] No compute node record for cpu-1:63b0294e-f555-48a6-a542-3466427066a9: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 63b0294e-f555-48a6-a542-3466427066a9 could not be found.
[ 603.393724] env[68964]: INFO nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 63b0294e-f555-48a6-a542-3466427066a9
[ 603.447882] env[68964]: DEBUG nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 603.448104] env[68964]: DEBUG nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 603.557707] env[68964]: INFO nova.scheduler.client.report [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] [req-d3104f32-7376-454c-9142-2c970e3f8582] Created resource provider record via placement API for resource provider with UUID 63b0294e-f555-48a6-a542-3466427066a9 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28.
[ 603.580484] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb898ba-58db-47d8-b244-00485e66dcfa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.587801] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feee26de-bb3a-4d43-9c73-a09baec4a156 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.617510] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18fb077e-ff9e-4f46-8fa2-d89d6d21f0b6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.624963] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b0d51e6-c2e4-415d-8080-7a6530ea2b7f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.638038] env[68964]: DEBUG nova.compute.provider_tree [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 603.678785] env[68964]: DEBUG nova.scheduler.client.report [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Updated inventory for provider 63b0294e-f555-48a6-a542-3466427066a9 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
[ 603.679044] env[68964]: DEBUG nova.compute.provider_tree [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Updating resource provider 63b0294e-f555-48a6-a542-3466427066a9 generation from 0 to 1 during operation: update_inventory {{(pid=68964) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 603.679169] env[68964]: DEBUG nova.compute.provider_tree [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 603.730380] env[68964]: DEBUG nova.compute.provider_tree [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Updating resource provider 63b0294e-f555-48a6-a542-3466427066a9 generation from 1 to 2 during operation: update_traits {{(pid=68964) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
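The inventory dict pushed to placement above fixes the provider's capacity. For each resource class, usable capacity is (total - reserved) * allocation_ratio, and max_unit caps what a single allocation may request. Applying that to the logged values:

    # Values copied from the inventory records above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 98},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: capacity {capacity:.0f}, at most {inv['max_unit']} per allocation")
    # VCPU: capacity 192; MEMORY_MB: capacity 196078; DISK_GB: capacity 400

The generation bumps (0 to 1 for update_inventory, 1 to 2 for update_traits) are placement's optimistic-concurrency counter: a writer that sends a stale generation gets a conflict back and must refresh its view before retrying.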
[ 603.749836] env[68964]: DEBUG nova.compute.resource_tracker [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 603.750057] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.384s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 603.750287] env[68964]: DEBUG nova.service [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Creating RPC server for service compute {{(pid=68964) start /opt/stack/nova/nova/service.py:182}}
[ 603.764696] env[68964]: DEBUG nova.service [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] Join ServiceGroup membership for this service compute {{(pid=68964) start /opt/stack/nova/nova/service.py:199}}
[ 603.764906] env[68964]: DEBUG nova.servicegroup.drivers.db [None req-1041fa93-9b8a-4e9b-914f-d7b7cc4cc0ad None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=68964) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 613.060681] env[68964]: DEBUG dbcounter [-] [68964] Writing DB stats nova_cell0:SELECT=1 {{(pid=68964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}}
[ 613.062292] env[68964]: DEBUG dbcounter [-] [68964] Writing DB stats nova_cell1:SELECT=1 {{(pid=68964) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}}
[ 646.769791] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_power_states {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 646.788898] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Getting list of instances from cluster (obj){
[ 646.788898] env[68964]: value = "domain-c8"
[ 646.788898] env[68964]: _type = "ClusterComputeResource"
[ 646.788898] env[68964]: } {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 646.790400] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f128bc5-bb9b-44ac-bd93-2e3e583a7d14 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 646.801565] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Got total of 0 instances {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 646.801803] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
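_sync_power_states and _cleanup_running_deleted_instances above are oslo.service periodic tasks: decorated methods that the service loop runs on a timer, producing the "Running periodic task ..." lines. A minimal sketch of the decorator (the spacing value is illustrative, not Nova's):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)  # illustrative interval
        def _sync_power_states(self, context):
            pass  # placeholder body

    # The service's timer calls manager.run_periodic_tasks(context); each
    # task that is due is invoked and logged as "Running periodic task ...".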
[ 646.804243] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Getting list of instances from cluster (obj){
[ 646.804243] env[68964]: value = "domain-c8"
[ 646.804243] env[68964]: _type = "ClusterComputeResource"
[ 646.804243] env[68964]: } {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 646.805945] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38574319-3cdc-4bd0-84d0-4d42abaaac6c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 646.820300] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Got total of 0 instances {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 651.047918] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "2857727a-d410-4021-ae38-bc2bf6aac400" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 651.048409] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Lock "2857727a-d410-4021-ae38-bc2bf6aac400" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 651.074281] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
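The "(obj){ value = "domain-c8" ... }" dump is the suds rendering of the cluster managed-object reference that list_instances walks, and the surrounding RetrievePropertiesEx calls are oslo.vmware's PropertyCollector round trips. A hedged sketch of that retrieval pattern (the connection values are placeholders, not this deployment's; get_objects is oslo.vmware's helper over the PropertyCollector):

    from oslo_vmware import api as vmware_api
    from oslo_vmware import vim_util

    # Placeholder credentials; the real ones come from nova.conf's [vmware] section.
    session = vmware_api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',
        api_retry_count=3, task_poll_interval=0.5)

    # One RetrievePropertiesEx round trip: up to 100 VirtualMachine objects
    # with only their name property, much as list_instances does per cluster.
    vms = session.invoke_api(vim_util, 'get_objects',
                             session.vim, 'VirtualMachine', 100, ['name'])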
[ 651.203553] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 651.203821] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 651.206035] env[68964]: INFO nova.compute.claims [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 651.364356] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb391e0-0eda-4639-8ceb-07bbf14f9f78 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.374496] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6488170-d1e9-4d83-a60a-dcb9e4ddfaf3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.413273] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c41150-c7fc-4c9d-b0f3-3b345055d2e6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.422050] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a2ec37-249d-490b-9088-c16bb0c2cfa5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.436638] env[68964]: DEBUG nova.compute.provider_tree [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 651.454306] env[68964]: DEBUG nova.scheduler.client.report [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 651.484941] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 651.485609] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 651.549987] env[68964]: DEBUG nova.compute.utils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 651.555044] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Not allocating networking since 'none' was specified. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 651.568508] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 651.680132] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 651.689032] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "4da35ed9-8646-45b8-b66f-715195a405a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 651.689254] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "4da35ed9-8646-45b8-b66f-715195a405a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 651.704159] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
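"Start building networks asynchronously" marks the point where the build flow forks: network allocation runs in a greenthread while block-device preparation continues, and the result is only joined when the VIF info is actually needed (trivially here, since networking 'none' was requested). The pattern, as a hedged eventlet sketch rather than Nova's exact code:

    import eventlet

    def allocate_network_async(context, instance):
        return []  # placeholder: would call Neutron's allocate_for_instance()

    nw_future = eventlet.spawn(allocate_network_async, None, None)
    # ... build block device mappings concurrently ...
    nw_info = nw_future.wait()  # join point before plugging VIFs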
[ 651.801572] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 651.801900] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 651.803392] env[68964]: INFO nova.compute.claims [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 651.921757] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e70f5c4a-09ef-417a-bb32-b6f41fa07651 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.932113] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d000629-b275-4af4-a2c1-8a7676dd00c1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.970708] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 651.970959] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 651.971152] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 651.971405] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 651.971487] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 651.971632] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 651.972515] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 651.972690] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 651.973191] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 651.973293] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 651.973417] env[68964]: DEBUG nova.virt.hardware [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 651.974336] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f9c29f-733c-494a-b1ad-5dd390265807 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.980328] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccdc4f28-c8e0-46e4-b0c8-24eb75f1aaf9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 651.989564] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec1e1f02-202c-4133-9d20-7f13fc2af7a3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
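The topology walk above reduces to: the flavor and image set no limits (0 means unset, so the 65536 ceilings apply), the instance has one vCPU, and the only factorization of 1 is sockets=1, cores=1, threads=1. A toy enumeration of the same search space (not Nova's exact algorithm):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) triples whose product is vcpus."""
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)], matching the log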
[ 652.013404] env[68964]: DEBUG nova.compute.provider_tree [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 652.016221] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9bd873b-d8a7-4cb9-ba65-ee0fd98998b0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.036335] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1c62409-e389-4af9-ba6e-d1e3aa4a04d5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.050540] env[68964]: DEBUG nova.scheduler.client.report [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 652.064599] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Instance VIF info [] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 652.073845] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 652.076501] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cacccc35-b9f1-495d-8fcb-e4aab792d6b0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.077162] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 652.077648] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 652.093769] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Created folder: OpenStack in parent group-v4.
[ 652.093978] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Creating folder: Project (ef3afa40b89b450caa5a21dcb80b95a8). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 652.094238] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bae5fda1-3d49-4444-ae1d-5c4f15fb1ee9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.105766] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Created folder: Project (ef3afa40b89b450caa5a21dcb80b95a8) in parent group-v684465.
[ 652.105989] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Creating folder: Instances. Parent ref: group-v684466. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 652.106314] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-99a4d0c6-7a82-4662-85c4-829cbc4f99a4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.120021] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Created folder: Instances in parent group-v684466.
[ 652.120021] env[68964]: DEBUG oslo.service.loopingcall [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 652.120021] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 652.120021] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6ff42310-0929-4043-bf50-0ed123e42e6f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.133743] env[68964]: DEBUG nova.compute.utils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 652.136392] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 652.136672] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 652.141313] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 652.141313] env[68964]: value = "task-3431494"
[ 652.141313] env[68964]: _type = "Task"
[ 652.141313] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 652.146432] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 652.159016] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431494, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 652.232183] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
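"Waiting for the task: task-3431494 ... progress is 0%" is oslo.vmware's task poller: CreateVM_Task returns immediately with a Task reference, and wait_for_task() re-reads that task's info on an interval until it reports success or error. Sketched against the session object from the earlier snippet (task_ref standing in for the value returned by the CreateVM_Task invocation):

    # 'session' as constructed earlier; 'task_ref' is the Task managed-object
    # reference returned by invoking Folder.CreateVM_Task.
    task_info = session.wait_for_task(task_ref)
    # Polls Task.info, logging "progress is N%"; on success it returns the
    # completed info (duration_secs ~0.37 here), on error it raises the
    # translated VMware fault.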
[ 652.270805] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 652.271114] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 652.271288] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 652.271478] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 652.271618] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 652.271770] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 652.271980] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 652.272181] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 652.272346] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 652.272504] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 652.272665] env[68964]: DEBUG nova.virt.hardware [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 652.273897] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dc3c3d0-5e39-43ed-80b0-0ca8fc999fe3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.286414] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-611ef5dd-feab-4f6e-afe0-549926693360 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.511771] env[68964]: DEBUG nova.policy [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a4aced3d14d4dd786a654eacf697bae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4e7b4eec810a4475a868d421674362cf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 652.659539] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquiring lock "4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 652.660785] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Lock "4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 652.661136] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431494, 'name': CreateVM_Task, 'duration_secs': 0.373697} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
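The "Policy check for network:attach_external_network failed" line is Nova's policy layer denying a member-role token the external-network rule before it falls back to ordinary network selection; the check itself goes through oslo.policy's Enforcer. A hedged sketch with an illustrative admin-only rule (not Nova's registered default verbatim):

    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    # Illustrative registration: an admin-only rule of the same name.
    enforcer.register_default(
        policy.RuleDefault('network:attach_external_network', 'role:admin'))

    creds = {'roles': ['reader', 'member'],
             'project_id': '4e7b4eec810a4475a868d421674362cf'}
    print(enforcer.enforce('network:attach_external_network', {}, creds))
    # False for a member-only token, as in the log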
[ 652.661530] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 652.662618] env[68964]: DEBUG oslo_vmware.service [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-828b919b-9d3c-487b-9a70-b4391b5f1c55 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.673067] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 652.673067] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 652.673067] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 652.673067] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-284b0c56-7c17-4970-9ea6-c1c0e6354c6f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.675514] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 652.681255] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Waiting for the task: (returnval){
[ 652.681255] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525e8128-9664-641e-228e-27bbadb3fe41"
[ 652.681255] env[68964]: _type = "Task"
[ 652.681255] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 652.700200] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 652.700485] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 652.702489] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 652.702489] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 652.702789] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 652.703051] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4421a6e0-038a-4de3-89d6-4104274ddfca {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 652.721912] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 652.722032] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
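The nested lock names above are how concurrent builds serialize on the per-datastore image cache: first a lock on the cache entry for the image ID while checking whether it already exists (the SearchDatastore_Task), then a narrower lock on the cached .vmdk itself while it is fetched or reused. A sketch of the naming pattern with lockutils (the names are copied from the log; the bodies are placeholders):

    from oslo_concurrency import lockutils

    image_id = 'b0d1c28b-5c3d-4c47-808f-66751157cde6'
    cache_root = '[datastore2] devstack-image-cache_base'

    with lockutils.lock(f'{cache_root}/{image_id}'):
        pass  # check the cache directory for the image

    with lockutils.lock(f'{cache_root}/{image_id}/{image_id}.vmdk'):
        pass  # fetch into, or reuse, the cached VMDK for this instance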
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 652.723959] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c27b624-d8ca-4b46-b1e8-40b5a38bccd9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.731535] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bbc7c4cf-206a-44c6-926b-5aae524b54dc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.743664] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Waiting for the task: (returnval){ [ 652.743664] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]527db17f-ca47-9129-74f1-76613d04353a" [ 652.743664] env[68964]: _type = "Task" [ 652.743664] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 652.750215] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]527db17f-ca47-9129-74f1-76613d04353a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 652.766120] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.766390] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.768015] env[68964]: INFO nova.compute.claims [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 652.922405] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ce35dd9-1651-4570-80f7-2538ee84993d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.931521] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a642405-80e5-43a4-abfb-805f60d01a75 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.964675] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00deb219-1ae2-44dd-8c95-5fba1894e04b {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.972629] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-791ace7a-df71-42ea-85a0-f6bb270e1310 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.986750] env[68964]: DEBUG nova.compute.provider_tree [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 652.998369] env[68964]: DEBUG nova.scheduler.client.report [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 653.017902] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.017902] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 653.064095] env[68964]: DEBUG nova.compute.utils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 653.065519] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Allocating IP information in the background. 
{{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 653.065742] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 653.087971] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 653.197552] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 653.236941] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 653.236941] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 653.236941] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 653.237095] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 653.237095] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 653.237095] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 653.237196] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 653.237338] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 653.237496] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 653.237648] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 653.237811] env[68964]: DEBUG nova.virt.hardware [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 653.239020] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78c92db7-a1f3-4757-903c-338235f6f380 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.261174] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 653.261174] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Creating directory with path [datastore2] vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 653.261174] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-677b7dc0-f7fc-4513-b742-54fca6482416 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.269060] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9a203b2e-03b4-4031-a352-f60ed6cdb394 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.301555] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Created directory with path [datastore2] vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 653.301758] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Fetch image to [datastore2] vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 653.301926] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 653.302724] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f060eab9-af63-47de-b15e-0c4b999c7139 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.310612] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-970d72a6-754d-4828-a013-8b723ece6f42 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.322559] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9757b69d-e26d-4b61-95fa-6d9f1d470d06 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.362048] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233f91c7-5661-420b-97c9-da0c60525504 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.368646] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ee942355-e756-4b82-aa53-b11fc05e7102 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 653.393367] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 653.480744] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 653.487216] env[68964]: DEBUG nova.policy [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be4370d8b4af4843bff082a8bd5ae1da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b4e6ef53a924b9f9c4b1d56c1010707', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 653.562323] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 653.562323] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 654.288566] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Successfully created port: 36f2374e-b512-4dd8-8329-86aafca0fd6e {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 654.296921] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "617f46a1-ca50-4561-9d0f-a596e35bf26d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.297191] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.313693] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 654.401817] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.402124] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.405306] env[68964]: INFO nova.compute.claims [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 654.656142] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquiring lock "ed73ed7d-e299-472a-805c-32bf83e96f8d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.656416] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 
tempest-ServersAdminNegativeTestJSON-370703798-project-member] Lock "ed73ed7d-e299-472a-805c-32bf83e96f8d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.659579] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaa8e54c-ef15-45c3-b316-64aeca6e7f94 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.675443] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ed7e590-92f8-44a3-bed5-f8d1a300576e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.682874] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 654.715259] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d3ede92-cac3-4a55-bfd6-4c9b15ec53e7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.722686] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc4c5c3b-814a-4da5-9c24-d6094297010c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.741365] env[68964]: DEBUG nova.compute.provider_tree [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 654.754169] env[68964]: DEBUG nova.scheduler.client.report [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 654.773286] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.371s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 654.773814] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] 
Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 654.786760] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 654.786760] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 654.787257] env[68964]: INFO nova.compute.claims [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 654.836126] env[68964]: DEBUG nova.compute.utils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 654.837883] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 654.837883] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 654.856038] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 654.944897] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 654.957143] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0c0681b-3a34-4f64-8413-f24d06d7c725 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 654.968776] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-478d3831-3344-488f-8c28-2bc609abf26d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.006685] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 655.006986] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 655.007164] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 655.007339] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 655.007841] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 655.007841] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 655.007841] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 655.008110] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 655.008283] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 655.008441] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 655.008668] env[68964]: DEBUG nova.virt.hardware [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 655.009538] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18dbc76b-f07a-4d3c-acd1-857153fcf4ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.012553] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddd82d71-896b-4765-ae7c-10d315e5d171 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.023048] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3abb60c2-984c-44ee-a97b-46ba2d0cbeda {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.028814] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-371e54a1-ed68-4591-b18f-771c1e04ccbb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.043819] env[68964]: DEBUG nova.compute.provider_tree [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 655.054522] env[68964]: DEBUG nova.scheduler.client.report [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 655.075258] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 655.075829] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 655.116586] env[68964]: DEBUG nova.compute.utils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 655.121303] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 655.121303] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 655.138094] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 655.226949] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 655.252018] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 655.252609] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 655.252609] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 655.252609] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 655.252754] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 655.252849] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 655.253071] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 655.253319] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 655.253388] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 655.253798] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 655.254032] env[68964]: DEBUG nova.virt.hardware [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 655.255300] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c921053-c14b-4f5f-a289-a31a85f6cc01 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.265227] env[68964]: DEBUG nova.policy [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e0ae927fa87c40b89cb2119f92b83d06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a86368d16cac467696c3558e5b05db85', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 655.267719] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47887fd7-c15a-4979-8743-30fc536baf93 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.579475] env[68964]: DEBUG nova.policy [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '04927d1686de452894b6ad1cd66d5377', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9b12516e33045308393a7f8ea4b9925', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 655.625638] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Successfully created port: 6dc4021c-87ce-4ea8-947e-448bd6a3271d {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 657.123129] env[68964]: 
DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Successfully created port: 27fb928e-c584-4c72-bd2c-00fff21ba929 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 657.128524] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Successfully created port: f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 658.281084] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Successfully updated port: 36f2374e-b512-4dd8-8329-86aafca0fd6e {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 658.301924] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "refresh_cache-4da35ed9-8646-45b8-b66f-715195a405a6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 658.301924] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired lock "refresh_cache-4da35ed9-8646-45b8-b66f-715195a405a6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 658.302098] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 658.447465] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Successfully updated port: 6dc4021c-87ce-4ea8-947e-448bd6a3271d {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 658.471031] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquiring lock "refresh_cache-4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 658.471433] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquired lock "refresh_cache-4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 658.471748] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 
tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 658.484017] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 658.610724] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 659.220677] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Updating instance_info_cache with network_info: [{"id": "6dc4021c-87ce-4ea8-947e-448bd6a3271d", "address": "fa:16:3e:e5:3f:69", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.41", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6dc4021c-87", "ovs_interfaceid": "6dc4021c-87ce-4ea8-947e-448bd6a3271d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 659.238379] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Releasing lock "refresh_cache-4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 659.238688] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Instance network_info: |[{"id": "6dc4021c-87ce-4ea8-947e-448bd6a3271d", "address": "fa:16:3e:e5:3f:69", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "192.168.233.41", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6dc4021c-87", "ovs_interfaceid": "6dc4021c-87ce-4ea8-947e-448bd6a3271d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 659.239179] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e5:3f:69', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9c4edd5-d88e-4996-afea-00130ace0dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6dc4021c-87ce-4ea8-947e-448bd6a3271d', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 659.248910] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Creating folder: Project (2b4e6ef53a924b9f9c4b1d56c1010707). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 659.250100] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9be940ed-8980-462c-86e7-d21fa760037b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.265379] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Created folder: Project (2b4e6ef53a924b9f9c4b1d56c1010707) in parent group-v684465. [ 659.265567] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Creating folder: Instances. Parent ref: group-v684469. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 659.265802] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c77c01ad-ec01-487b-80a1-6ac1824dd1a3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.275729] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Created folder: Instances in parent group-v684469. 
[ 659.276071] env[68964]: DEBUG oslo.service.loopingcall [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 659.276164] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 659.276371] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5203cc62-f8a3-4cb6-b92b-0359d54df4f3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.300220] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 659.300220] env[68964]: value = "task-3431497" [ 659.300220] env[68964]: _type = "Task" [ 659.300220] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 659.309644] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431497, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 659.578769] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Updating instance_info_cache with network_info: [{"id": "36f2374e-b512-4dd8-8329-86aafca0fd6e", "address": "fa:16:3e:ea:32:44", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36f2374e-b5", "ovs_interfaceid": "36f2374e-b512-4dd8-8329-86aafca0fd6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 659.605614] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Releasing lock "refresh_cache-4da35ed9-8646-45b8-b66f-715195a405a6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 659.605614] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Instance 
network_info: |[{"id": "36f2374e-b512-4dd8-8329-86aafca0fd6e", "address": "fa:16:3e:ea:32:44", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36f2374e-b5", "ovs_interfaceid": "36f2374e-b512-4dd8-8329-86aafca0fd6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 659.606121] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ea:32:44', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9c4edd5-d88e-4996-afea-00130ace0dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '36f2374e-b512-4dd8-8329-86aafca0fd6e', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 659.616785] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating folder: Project (4e7b4eec810a4475a868d421674362cf). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 659.617745] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f984feef-2c5f-4c2c-a887-5175e54a814b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.627026] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Created folder: Project (4e7b4eec810a4475a868d421674362cf) in parent group-v684465. [ 659.627172] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating folder: Instances. Parent ref: group-v684472. 
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 659.627415] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c40bd9e3-9c64-4877-9c87-b373d730dfe2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.642045] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Created folder: Instances in parent group-v684472. [ 659.642105] env[68964]: DEBUG oslo.service.loopingcall [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 659.642323] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 659.642485] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0b08737a-ffa5-4995-a93e-5130d5ea1acf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.662909] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 659.662909] env[68964]: value = "task-3431500" [ 659.662909] env[68964]: _type = "Task" [ 659.662909] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 659.671499] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431500, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 659.745060] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.745060] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.745060] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 659.745060] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 659.767454] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 659.767454] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 659.767454] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 659.767454] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 659.767454] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 659.767728] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 659.769831] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.770110] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.770335] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.770528] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.770715] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.770896] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.772134] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 659.772134] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 659.789165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.790605] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.790605] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 659.790605] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 659.790776] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90909327-b45d-432f-8df7-70bb7c878e1e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.801065] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4515878b-65cf-4595-9538-e56deef9bde8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.826539] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30d53646-ee23-4c44-a934-7ad79b4e980d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.829169] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431497, 'name': CreateVM_Task} progress is 99%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 659.835290] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f46d86-fa63-4955-a77b-342af222fcf2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 659.871734] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180974MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 659.871895] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.872109] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.980484] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2857727a-d410-4021-ae38-bc2bf6aac400 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 659.980484] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4da35ed9-8646-45b8-b66f-715195a405a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 659.980625] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 659.980723] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 617f46a1-ca50-4561-9d0f-a596e35bf26d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 659.980892] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ed73ed7d-e299-472a-805c-32bf83e96f8d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 659.981022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 659.981160] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 660.100195] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a4579f1-8635-47b9-a248-cf33d5e02953 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.108230] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05d4cf21-8c62-4cd8-8c6a-d1ed2d91c25f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.144641] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a73c5cb-b519-4a3e-9478-b2fd3c6e10ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.152044] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c87a4aab-db54-4fd3-bbb1-24d8197f6139 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.167854] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 660.180531] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431500, 'name': CreateVM_Task} progress is 25%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 660.186281] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 660.207760] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 660.208119] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 660.319601] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431497, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 660.349862] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Successfully updated port: 27fb928e-c584-4c72-bd2c-00fff21ba929 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 660.364119] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "refresh_cache-617f46a1-ca50-4561-9d0f-a596e35bf26d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 660.364274] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquired lock "refresh_cache-617f46a1-ca50-4561-9d0f-a596e35bf26d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 660.364424] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 660.454107] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Successfully updated port: f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 660.466947] env[68964]: DEBUG oslo_concurrency.lockutils 
[None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquiring lock "refresh_cache-ed73ed7d-e299-472a-805c-32bf83e96f8d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 660.466947] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquired lock "refresh_cache-ed73ed7d-e299-472a-805c-32bf83e96f8d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 660.466947] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 660.609225] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 660.652977] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 660.678620] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431500, 'name': CreateVM_Task} progress is 25%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 660.794090] env[68964]: DEBUG nova.compute.manager [req-b5713f84-85c1-4aff-8de4-d16744b44359 req-bc7be8e3-36ca-4e47-949b-d13e68896d48 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Received event network-vif-plugged-36f2374e-b512-4dd8-8329-86aafca0fd6e {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 660.794371] env[68964]: DEBUG oslo_concurrency.lockutils [req-b5713f84-85c1-4aff-8de4-d16744b44359 req-bc7be8e3-36ca-4e47-949b-d13e68896d48 service nova] Acquiring lock "4da35ed9-8646-45b8-b66f-715195a405a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 660.794594] env[68964]: DEBUG oslo_concurrency.lockutils [req-b5713f84-85c1-4aff-8de4-d16744b44359 req-bc7be8e3-36ca-4e47-949b-d13e68896d48 service nova] Lock "4da35ed9-8646-45b8-b66f-715195a405a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 660.794754] env[68964]: DEBUG oslo_concurrency.lockutils [req-b5713f84-85c1-4aff-8de4-d16744b44359 req-bc7be8e3-36ca-4e47-949b-d13e68896d48 service nova] Lock "4da35ed9-8646-45b8-b66f-715195a405a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 660.794997] env[68964]: DEBUG nova.compute.manager [req-b5713f84-85c1-4aff-8de4-d16744b44359 req-bc7be8e3-36ca-4e47-949b-d13e68896d48 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] No waiting events found dispatching network-vif-plugged-36f2374e-b512-4dd8-8329-86aafca0fd6e {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 660.795367] env[68964]: WARNING nova.compute.manager [req-b5713f84-85c1-4aff-8de4-d16744b44359 req-bc7be8e3-36ca-4e47-949b-d13e68896d48 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Received unexpected event network-vif-plugged-36f2374e-b512-4dd8-8329-86aafca0fd6e for instance with vm_state building and task_state spawning. [ 660.821164] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431497, 'name': CreateVM_Task, 'duration_secs': 1.364633} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 660.821164] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 660.838191] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 660.838402] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 660.838786] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 660.839071] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a4761a80-8d02-488f-aa05-3ccc4d489fb3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 660.853572] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Waiting for the task: (returnval){ [ 660.853572] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52776f39-b364-351c-d61b-0ac69a7de28d" [ 660.853572] env[68964]: _type = "Task" [ 660.853572] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 660.866542] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52776f39-b364-351c-d61b-0ac69a7de28d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 660.894563] env[68964]: DEBUG nova.compute.manager [req-a18c6ffe-489b-4516-867d-8aa38cdbb423 req-0af68c23-fd82-4c71-adb7-d869fd71aa0a service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Received event network-vif-plugged-6dc4021c-87ce-4ea8-947e-448bd6a3271d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 660.894563] env[68964]: DEBUG oslo_concurrency.lockutils [req-a18c6ffe-489b-4516-867d-8aa38cdbb423 req-0af68c23-fd82-4c71-adb7-d869fd71aa0a service nova] Acquiring lock "4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 660.894563] env[68964]: DEBUG oslo_concurrency.lockutils [req-a18c6ffe-489b-4516-867d-8aa38cdbb423 req-0af68c23-fd82-4c71-adb7-d869fd71aa0a service nova] Lock "4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 660.894563] env[68964]: DEBUG oslo_concurrency.lockutils [req-a18c6ffe-489b-4516-867d-8aa38cdbb423 req-0af68c23-fd82-4c71-adb7-d869fd71aa0a service nova] Lock "4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 660.894819] env[68964]: DEBUG nova.compute.manager [req-a18c6ffe-489b-4516-867d-8aa38cdbb423 req-0af68c23-fd82-4c71-adb7-d869fd71aa0a service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] No waiting events found dispatching network-vif-plugged-6dc4021c-87ce-4ea8-947e-448bd6a3271d {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 660.894819] env[68964]: WARNING nova.compute.manager [req-a18c6ffe-489b-4516-867d-8aa38cdbb423 req-0af68c23-fd82-4c71-adb7-d869fd71aa0a service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Received unexpected event network-vif-plugged-6dc4021c-87ce-4ea8-947e-448bd6a3271d for instance with vm_state building and task_state spawning. 
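The "No waiting events found dispatching ..." / "Received unexpected event ..." pairs above occur because Neutron delivered network-vif-plugged before the compute manager had registered a waiter for it (both instances were still building). A simplified illustration of that prepare-then-dispatch pattern follows; this is a stand-in sketch under assumed names, not Nova's actual InstanceEvents implementation.

    # Simplified illustration of external-event dispatch: the receiving
    # side pops a waiter keyed by (instance, event) and signals it; if no
    # waiter was registered yet, the event is logged as unexpected.
    import threading

    _waiters = {}          # (instance_uuid, event_name) -> threading.Event
    _waiters_lock = threading.Lock()

    def prepare_for_event(instance_uuid, event_name):
        ev = threading.Event()
        with _waiters_lock:
            _waiters[(instance_uuid, event_name)] = ev
        return ev

    def dispatch_event(instance_uuid, event_name):
        with _waiters_lock:
            ev = _waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            print('Received unexpected event %s for instance %s'
                  % (event_name, instance_uuid))
        else:
            ev.set()

    # Spawning side: register *before* plugging VIFs, then wait.
    waiter = prepare_for_event('4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9',
                               'network-vif-plugged')
    # ... plug VIFs / create the VM ...
    if not waiter.wait(timeout=300):
        raise RuntimeError('timed out waiting for network-vif-plugged')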
[ 661.173084] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Updating instance_info_cache with network_info: [{"id": "f8fb2ace-00cb-4c88-ab9d-76b7656d69a5", "address": "fa:16:3e:c8:48:8d", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8fb2ace-00", "ovs_interfaceid": "f8fb2ace-00cb-4c88-ab9d-76b7656d69a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 661.189150] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431500, 'name': CreateVM_Task, 'duration_secs': 1.193098} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 661.189347] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 661.190071] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 661.202659] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 661.202659] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 661.202659] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 
tempest-ServersAdminNegativeTestJSON-370703798-project-member] Releasing lock "refresh_cache-ed73ed7d-e299-472a-805c-32bf83e96f8d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.202814] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Instance network_info: |[{"id": "f8fb2ace-00cb-4c88-ab9d-76b7656d69a5", "address": "fa:16:3e:c8:48:8d", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8fb2ace-00", "ovs_interfaceid": "f8fb2ace-00cb-4c88-ab9d-76b7656d69a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 661.203745] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Updating instance_info_cache with network_info: [{"id": "27fb928e-c584-4c72-bd2c-00fff21ba929", "address": "fa:16:3e:e0:4f:f4", "network": {"id": "0bfd7d0e-18f7-49f0-99df-6153b8b82614", "bridge": "br-int", "label": "tempest-ServersTestJSON-186331016-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a86368d16cac467696c3558e5b05db85", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1cbd5e0e-9116-46f1-9748-13a73d2d7e75", "external-id": "nsx-vlan-transportzone-690", "segmentation_id": 690, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27fb928e-c5", "ovs_interfaceid": "27fb928e-c584-4c72-bd2c-00fff21ba929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 661.207440] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: 
ed73ed7d-e299-472a-805c-32bf83e96f8d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c8:48:8d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9c4edd5-d88e-4996-afea-00130ace0dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f8fb2ace-00cb-4c88-ab9d-76b7656d69a5', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 661.216478] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Creating folder: Project (a9b12516e33045308393a7f8ea4b9925). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 661.217882] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e7755c04-8688-44f6-8b89-6cdd16de2d2e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.228604] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 661.234269] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Created folder: Project (a9b12516e33045308393a7f8ea4b9925) in parent group-v684465. [ 661.234557] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Creating folder: Instances. Parent ref: group-v684475. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 661.237109] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-98107adf-b1d9-4304-815d-30248ecaa0b8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.247392] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Created folder: Instances in parent group-v684475. [ 661.248725] env[68964]: DEBUG oslo.service.loopingcall [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 661.248725] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 661.248725] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0ecac4a5-20d2-4aca-a3e7-79be611bc902 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.267236] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Releasing lock "refresh_cache-617f46a1-ca50-4561-9d0f-a596e35bf26d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.268169] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance network_info: |[{"id": "27fb928e-c584-4c72-bd2c-00fff21ba929", "address": "fa:16:3e:e0:4f:f4", "network": {"id": "0bfd7d0e-18f7-49f0-99df-6153b8b82614", "bridge": "br-int", "label": "tempest-ServersTestJSON-186331016-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a86368d16cac467696c3558e5b05db85", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1cbd5e0e-9116-46f1-9748-13a73d2d7e75", "external-id": "nsx-vlan-transportzone-690", "segmentation_id": 690, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27fb928e-c5", "ovs_interfaceid": "27fb928e-c584-4c72-bd2c-00fff21ba929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 661.268437] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e0:4f:f4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1cbd5e0e-9116-46f1-9748-13a73d2d7e75', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '27fb928e-c584-4c72-bd2c-00fff21ba929', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 661.276590] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Creating folder: Project (a86368d16cac467696c3558e5b05db85). Parent ref: group-v684465. 
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 661.278324] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fb167bfa-b8e3-4bfe-b0db-1413d083c989 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.279983] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 661.279983] env[68964]: value = "task-3431503" [ 661.279983] env[68964]: _type = "Task" [ 661.279983] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.288931] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431503, 'name': CreateVM_Task} progress is 6%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 661.290381] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Created folder: Project (a86368d16cac467696c3558e5b05db85) in parent group-v684465. [ 661.290607] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Creating folder: Instances. Parent ref: group-v684477. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 661.291388] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c20a98a9-764f-4bf3-8935-b3fb958f5704 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.309935] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Created folder: Instances in parent group-v684477. [ 661.310805] env[68964]: DEBUG oslo.service.loopingcall [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 661.310805] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 661.311550] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7b3f0e7d-9a2c-4d16-84ab-e643c015d488 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.329879] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 661.333636] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 661.333636] env[68964]: INFO nova.compute.claims [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 661.340623] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 661.340623] env[68964]: value = "task-3431506" [ 661.340623] env[68964]: _type = "Task" [ 661.340623] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.351966] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431506, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 661.365602] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.365736] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 661.365973] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 661.366215] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 661.366513] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 661.366805] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bbea2316-423d-4500-9ef5-7569816aa015 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.373090] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){ [ 661.373090] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52e8c473-db2d-0df0-6b70-bfc513970e9d" [ 661.373090] env[68964]: _type = "Task" [ 661.373090] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.385293] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52e8c473-db2d-0df0-6b70-bfc513970e9d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 661.562183] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69f544fe-72cb-4e9a-8db5-17488473fa6a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.571179] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8d74c6e-58dc-421a-bc43-4547985a4b2d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.613830] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df44f523-a787-4fc2-8a59-c27de65f2d9a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.622148] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ecfcd85-10ac-42f5-a688-65f94a68d70e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.637487] env[68964]: DEBUG nova.compute.provider_tree [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 661.649304] env[68964]: DEBUG nova.scheduler.client.report [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 661.675584] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 661.676213] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 661.725493] env[68964]: DEBUG nova.compute.utils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 661.726681] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 661.726681] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 661.738945] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 661.793322] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431503, 'name': CreateVM_Task, 'duration_secs': 0.34654} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 661.793731] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 661.795052] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 661.841748] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 661.853863] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431506, 'name': CreateVM_Task, 'duration_secs': 0.29769} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 661.854053] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 661.854719] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 661.868289] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 661.868542] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 661.868695] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 661.868873] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 661.869057] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 661.869177] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 661.869396] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 
tempest-ServerDiagnosticsTest-2094767189-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 661.869556] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 661.869721] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 661.869926] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 661.870059] env[68964]: DEBUG nova.virt.hardware [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 661.870921] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5db3c6ae-9f19-4ae9-9b5b-6da255e86207 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.886278] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58590844-7fc4-44cf-a6c7-d93b61782559 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.892165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.892165] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 661.892165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 661.892421] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 
tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 661.894560] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 661.894560] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6138e226-8145-45df-9717-cfef9747d613 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.909730] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Waiting for the task: (returnval){ [ 661.909730] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5212463f-0213-cc53-723c-6621bfdd47cd" [ 661.909730] env[68964]: _type = "Task" [ 661.909730] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.923911] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.924050] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 661.925656] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 661.925656] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 661.925656] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 661.925656] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fa6fec32-7552-4b47-bd73-b12a42d0de45 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 661.937077] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Waiting for the task: (returnval){ [ 661.937077] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522f5b64-0ef2-1690-0f2f-bd70f4df3fe3" [ 661.937077] env[68964]: _type = "Task" [ 661.937077] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 661.946015] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 661.946015] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 661.946015] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 662.154215] env[68964]: DEBUG nova.policy [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ded838b38224760b16e29cbee13d0ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e7ee2d4e78324c5a91cd989f8717ae33', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 663.723466] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Successfully created port: 9c359d6f-bf7c-4806-8f54-9f44918cd679 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 664.481114] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Received event network-changed-36f2374e-b512-4dd8-8329-86aafca0fd6e {{(pid=68964) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11101}} [ 664.481320] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Refreshing instance network info cache due to event network-changed-36f2374e-b512-4dd8-8329-86aafca0fd6e. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 664.481608] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquiring lock "refresh_cache-4da35ed9-8646-45b8-b66f-715195a405a6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 664.481701] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquired lock "refresh_cache-4da35ed9-8646-45b8-b66f-715195a405a6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 664.481843] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Refreshing network info cache for port 36f2374e-b512-4dd8-8329-86aafca0fd6e {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 664.513019] env[68964]: DEBUG nova.compute.manager [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Received event network-changed-6dc4021c-87ce-4ea8-947e-448bd6a3271d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 664.513019] env[68964]: DEBUG nova.compute.manager [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Refreshing instance network info cache due to event network-changed-6dc4021c-87ce-4ea8-947e-448bd6a3271d. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 664.513019] env[68964]: DEBUG oslo_concurrency.lockutils [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] Acquiring lock "refresh_cache-4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 664.513019] env[68964]: DEBUG oslo_concurrency.lockutils [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] Acquired lock "refresh_cache-4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 664.513019] env[68964]: DEBUG nova.network.neutron [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Refreshing network info cache for port 6dc4021c-87ce-4ea8-947e-448bd6a3271d {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 666.233211] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.233491] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.248110] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 666.329387] env[68964]: DEBUG nova.network.neutron [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Updated VIF entry in instance network info cache for port 6dc4021c-87ce-4ea8-947e-448bd6a3271d. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 666.329864] env[68964]: DEBUG nova.network.neutron [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Updating instance_info_cache with network_info: [{"id": "6dc4021c-87ce-4ea8-947e-448bd6a3271d", "address": "fa:16:3e:e5:3f:69", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.41", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6dc4021c-87", "ovs_interfaceid": "6dc4021c-87ce-4ea8-947e-448bd6a3271d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 666.333320] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.333635] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.336575] env[68964]: INFO nova.compute.claims [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 666.346470] env[68964]: DEBUG oslo_concurrency.lockutils [req-01b57878-20ca-4364-8a1c-b3feaee62548 req-25ad95ad-72d4-4b72-9e4a-f95f717d7a46 service nova] Releasing lock "refresh_cache-4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 666.361606] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Updated VIF entry in instance network info cache for port 36f2374e-b512-4dd8-8329-86aafca0fd6e. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 666.361606] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Updating instance_info_cache with network_info: [{"id": "36f2374e-b512-4dd8-8329-86aafca0fd6e", "address": "fa:16:3e:ea:32:44", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36f2374e-b5", "ovs_interfaceid": "36f2374e-b512-4dd8-8329-86aafca0fd6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 666.374775] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Releasing lock "refresh_cache-4da35ed9-8646-45b8-b66f-715195a405a6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 666.375763] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Received event network-vif-plugged-27fb928e-c584-4c72-bd2c-00fff21ba929 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 666.376229] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquiring lock "617f46a1-ca50-4561-9d0f-a596e35bf26d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.376999] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.377819] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.378517] env[68964]: DEBUG nova.compute.manager 
[req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] No waiting events found dispatching network-vif-plugged-27fb928e-c584-4c72-bd2c-00fff21ba929 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 666.378952] env[68964]: WARNING nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Received unexpected event network-vif-plugged-27fb928e-c584-4c72-bd2c-00fff21ba929 for instance with vm_state building and task_state spawning. [ 666.379286] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Received event network-vif-plugged-f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 666.379705] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquiring lock "ed73ed7d-e299-472a-805c-32bf83e96f8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.380132] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Lock "ed73ed7d-e299-472a-805c-32bf83e96f8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.380428] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Lock "ed73ed7d-e299-472a-805c-32bf83e96f8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.380716] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] No waiting events found dispatching network-vif-plugged-f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 666.381291] env[68964]: WARNING nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Received unexpected event network-vif-plugged-f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 for instance with vm_state building and task_state spawning. 
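[annotation, not part of the captured log] The pop_instance_event entries above show the guarded-registry pattern Nova uses for external events: a per-instance "<uuid>-events" lock is taken, any waiter registered for the named event is popped and signalled, and an event that nobody is waiting on is logged as unexpected (the WARNING lines). A minimal sketch of that pattern follows; it uses plain threading primitives and illustrative names, not Nova's actual InstanceEvents class.

    # Simplified stand-in for the pop_instance_event flow seen above.
    # Event names mirror the log, e.g. "network-vif-plugged-<port-id>".
    import threading

    class InstanceEventRegistry:
        def __init__(self):
            self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock
            self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            """Register interest in an event before triggering the external operation."""
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = waiter
            return waiter

        def pop_event(self, instance_uuid, event_name):
            """Handle an event reported by Neutron; True if a waiter was signalled."""
            with self._lock:                # "acquired by ... _pop_event" in the log
                waiter = self._waiters.pop((instance_uuid, event_name), None)
            if waiter is None:
                # Mirrors "No waiting events found dispatching ..." and the
                # "Received unexpected event ..." WARNING entries above.
                return False
            waiter.set()
            return True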
[ 666.381578] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Received event network-changed-27fb928e-c584-4c72-bd2c-00fff21ba929 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 666.381855] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Refreshing instance network info cache due to event network-changed-27fb928e-c584-4c72-bd2c-00fff21ba929. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 666.382169] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquiring lock "refresh_cache-617f46a1-ca50-4561-9d0f-a596e35bf26d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 666.383020] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquired lock "refresh_cache-617f46a1-ca50-4561-9d0f-a596e35bf26d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 666.383020] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Refreshing network info cache for port 27fb928e-c584-4c72-bd2c-00fff21ba929 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 666.560698] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf9e3c0-821c-487c-9a4a-e65435396744 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.570472] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749d12ed-d559-4b6c-bbec-c275a88858c9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.607938] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24ab3f09-5ced-45cd-be61-f0453406d428 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.616234] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e09577-53a9-4c1c-89f9-32c908ead61c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.632038] env[68964]: DEBUG nova.compute.provider_tree [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 666.648809] env[68964]: DEBUG nova.scheduler.client.report [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on 
inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 666.671328] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 666.672156] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 666.732949] env[68964]: DEBUG nova.compute.utils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 666.734960] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 666.735413] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 666.764080] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 666.853981] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 666.886528] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 666.886528] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 666.886528] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 666.886679] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 666.886679] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 666.886679] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 666.886679] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 666.886810] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 666.886848] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 666.887067] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 666.887433] env[68964]: DEBUG nova.virt.hardware [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 666.888342] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-794904e9-26b0-4332-aa20-27f4e710f50b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 666.897277] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eff76b48-d93a-4f86-8e80-5d8d7904b30b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 667.266209] env[68964]: DEBUG nova.policy [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '761657e10b3945dfae197bfa8b1c376e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c1575f9a28d41c4bcec3a22545a5d37', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 667.544391] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Successfully updated port: 9c359d6f-bf7c-4806-8f54-9f44918cd679 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 667.564944] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "refresh_cache-0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 667.564944] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquired lock "refresh_cache-0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 667.565203] 
env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 667.800843] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 668.036868] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Updated VIF entry in instance network info cache for port 27fb928e-c584-4c72-bd2c-00fff21ba929. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 668.040499] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Updating instance_info_cache with network_info: [{"id": "27fb928e-c584-4c72-bd2c-00fff21ba929", "address": "fa:16:3e:e0:4f:f4", "network": {"id": "0bfd7d0e-18f7-49f0-99df-6153b8b82614", "bridge": "br-int", "label": "tempest-ServersTestJSON-186331016-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a86368d16cac467696c3558e5b05db85", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1cbd5e0e-9116-46f1-9748-13a73d2d7e75", "external-id": "nsx-vlan-transportzone-690", "segmentation_id": 690, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27fb928e-c5", "ovs_interfaceid": "27fb928e-c584-4c72-bd2c-00fff21ba929", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 668.062813] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Releasing lock "refresh_cache-617f46a1-ca50-4561-9d0f-a596e35bf26d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 668.063252] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Received event network-changed-f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 668.063549] env[68964]: DEBUG nova.compute.manager [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Refreshing instance network info cache due to 
event network-changed-f8fb2ace-00cb-4c88-ab9d-76b7656d69a5. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 668.064288] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquiring lock "refresh_cache-ed73ed7d-e299-472a-805c-32bf83e96f8d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 668.064772] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Acquired lock "refresh_cache-ed73ed7d-e299-472a-805c-32bf83e96f8d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 668.065611] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Refreshing network info cache for port f8fb2ace-00cb-4c88-ab9d-76b7656d69a5 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 668.718933] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Updating instance_info_cache with network_info: [{"id": "9c359d6f-bf7c-4806-8f54-9f44918cd679", "address": "fa:16:3e:e6:24:06", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c359d6f-bf", "ovs_interfaceid": "9c359d6f-bf7c-4806-8f54-9f44918cd679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 668.737495] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Releasing lock "refresh_cache-0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 668.737495] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance network_info: |[{"id": "9c359d6f-bf7c-4806-8f54-9f44918cd679", "address": "fa:16:3e:e6:24:06", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": 
"192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c359d6f-bf", "ovs_interfaceid": "9c359d6f-bf7c-4806-8f54-9f44918cd679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 668.737686] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e6:24:06', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9c4edd5-d88e-4996-afea-00130ace0dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9c359d6f-bf7c-4806-8f54-9f44918cd679', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 668.746629] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Creating folder: Project (e7ee2d4e78324c5a91cd989f8717ae33). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 668.747219] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1479a336-73ad-48db-80a3-91186c291ae7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 668.758115] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Created folder: Project (e7ee2d4e78324c5a91cd989f8717ae33) in parent group-v684465. [ 668.758555] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Creating folder: Instances. Parent ref: group-v684481. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 668.758555] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1528d19-3da1-4c76-9692-a02af66a19c3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 668.767796] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Created folder: Instances in parent group-v684481. 
[ 668.768059] env[68964]: DEBUG oslo.service.loopingcall [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 668.768250] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 668.768448] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-913f2589-1c69-4767-8046-2ba7ce6e2642 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 668.792181] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 668.792181] env[68964]: value = "task-3431509" [ 668.792181] env[68964]: _type = "Task" [ 668.792181] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 668.800346] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431509, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 668.972520] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Successfully created port: 4095f464-bbab-4645-90f0-c0ae53be0abb {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 669.303154] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431509, 'name': CreateVM_Task, 'duration_secs': 0.31695} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 669.303335] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 669.303971] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 669.304266] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 669.304467] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 669.304720] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9c9fe08e-8b6a-4f7e-a8f4-82f4075658fd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 669.310011] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Waiting for the task: (returnval){ [ 669.310011] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52201ea6-13d6-217d-69ec-152206f760b2" [ 669.310011] env[68964]: _type = "Task" [ 669.310011] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 669.318619] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52201ea6-13d6-217d-69ec-152206f760b2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 669.419496] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Updated VIF entry in instance network info cache for port f8fb2ace-00cb-4c88-ab9d-76b7656d69a5. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 669.419909] env[68964]: DEBUG nova.network.neutron [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Updating instance_info_cache with network_info: [{"id": "f8fb2ace-00cb-4c88-ab9d-76b7656d69a5", "address": "fa:16:3e:c8:48:8d", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8fb2ace-00", "ovs_interfaceid": "f8fb2ace-00cb-4c88-ab9d-76b7656d69a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.434790] env[68964]: DEBUG oslo_concurrency.lockutils [req-e2e69c4a-1f5a-4a56-9a05-1f2cd198bfc9 req-a0d8a268-e9da-4fc9-8f54-0c9f1485dee7 service nova] Releasing lock "refresh_cache-ed73ed7d-e299-472a-805c-32bf83e96f8d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 669.824021] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 669.824021] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 669.824021] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 670.116599] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.116879] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.132420] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 670.208363] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.208934] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.211044] env[68964]: INFO nova.compute.claims [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 670.243128] env[68964]: DEBUG nova.compute.manager [req-d476fa93-4ea0-480c-8194-a4a7f1d9a239 req-2dcc40ef-215d-4e09-80ab-9d384ef28db3 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Received event network-vif-plugged-9c359d6f-bf7c-4806-8f54-9f44918cd679 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 670.243345] env[68964]: DEBUG oslo_concurrency.lockutils [req-d476fa93-4ea0-480c-8194-a4a7f1d9a239 req-2dcc40ef-215d-4e09-80ab-9d384ef28db3 service nova] Acquiring lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.243550] env[68964]: DEBUG oslo_concurrency.lockutils [req-d476fa93-4ea0-480c-8194-a4a7f1d9a239 req-2dcc40ef-215d-4e09-80ab-9d384ef28db3 service nova] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.243708] env[68964]: DEBUG oslo_concurrency.lockutils [req-d476fa93-4ea0-480c-8194-a4a7f1d9a239 req-2dcc40ef-215d-4e09-80ab-9d384ef28db3 service nova] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf-events" 
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 670.243866] env[68964]: DEBUG nova.compute.manager [req-d476fa93-4ea0-480c-8194-a4a7f1d9a239 req-2dcc40ef-215d-4e09-80ab-9d384ef28db3 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] No waiting events found dispatching network-vif-plugged-9c359d6f-bf7c-4806-8f54-9f44918cd679 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 670.244034] env[68964]: WARNING nova.compute.manager [req-d476fa93-4ea0-480c-8194-a4a7f1d9a239 req-2dcc40ef-215d-4e09-80ab-9d384ef28db3 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Received unexpected event network-vif-plugged-9c359d6f-bf7c-4806-8f54-9f44918cd679 for instance with vm_state building and task_state spawning. [ 670.423719] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fbd62ca-55bf-4d98-8b84-21f34c65613c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.432303] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d1cd067-857b-41ed-a424-90bc4c646c2f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.466026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa80d15-a683-4028-bcee-943c4e045155 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.474385] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ef428c9-804e-4b4b-84a2-7deb7c9c00ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.487702] env[68964]: DEBUG nova.compute.provider_tree [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 670.498252] env[68964]: DEBUG nova.scheduler.client.report [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 670.517827] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 670.517827] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 670.611043] env[68964]: DEBUG nova.compute.utils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 670.612413] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 670.612686] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 670.631336] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 670.735300] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 670.771980] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 670.775160] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 670.775160] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 670.775160] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 670.775160] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 670.775160] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 670.775386] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 670.775386] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 670.775386] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 670.775386] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 670.775511] env[68964]: DEBUG nova.virt.hardware [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 670.775511] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0a4541a-da32-433a-b73b-3cdd443310b2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.785678] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d51e3af8-9059-45e0-af32-ae00be457815 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.350087] env[68964]: DEBUG nova.policy [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c1bca4332e7744be9066a6c7591b9801', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9318cf282d8b4a83a87c0c481f2d1011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 672.095459] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "329835df-cb38-495e-8a0e-539a396ddc74" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.095709] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "329835df-cb38-495e-8a0e-539a396ddc74" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.109704] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 
tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 672.131733] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Successfully updated port: 4095f464-bbab-4645-90f0-c0ae53be0abb {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 672.146979] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "refresh_cache-81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 672.147641] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquired lock "refresh_cache-81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 672.147862] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 672.192727] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.193107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.194780] env[68964]: INFO nova.compute.claims [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 672.272948] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.415813] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0294e082-24b6-41a0-a6a6-7cc0432215f8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.424891] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53a4848c-a47e-44ce-97b3-00ce9bf31567 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.475717] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d159009-7d17-4b8b-8dbc-8521ffb5a649 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.488399] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c537e3d-00d9-4894-b8ea-1a9a44b7c7e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.508742] env[68964]: DEBUG nova.compute.provider_tree [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 672.528286] env[68964]: DEBUG nova.scheduler.client.report [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 672.558313] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 672.559118] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 672.642345] env[68964]: DEBUG nova.compute.utils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 672.647834] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Not allocating networking since 'none' was specified. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 672.666386] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 672.749765] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Successfully created port: 481d0494-b598-4dfd-aa1d-c0e941fe26ae {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 672.770633] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 672.801789] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 672.801960] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 672.802177] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 672.802684] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 672.802684] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 672.802684] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 672.802877] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 672.803147] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 672.803290] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 
tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 672.803845] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 672.803845] env[68964]: DEBUG nova.virt.hardware [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 672.804552] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39fa78e6-eaac-4013-8b79-fc93ac272b7a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.814026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ffe664-0d7d-40e2-a27b-41450e510f9f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.828938] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance VIF info [] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 672.834764] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Creating folder: Project (3fad75ad1351421d908a540b53709f4f). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 672.835210] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2a45e813-c898-461e-a989-64d5f164c99a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.845636] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Created folder: Project (3fad75ad1351421d908a540b53709f4f) in parent group-v684465. [ 672.846063] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Creating folder: Instances. Parent ref: group-v684484. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 672.846235] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a60feead-89d3-4470-9a87-03fcfa1f393b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.855658] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Created folder: Instances in parent group-v684484. 
[ 672.855940] env[68964]: DEBUG oslo.service.loopingcall [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 672.856274] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 672.856461] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9e2480ff-ee46-4c66-86c6-15ecc6690149 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.876728] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 672.876728] env[68964]: value = "task-3431512" [ 672.876728] env[68964]: _type = "Task" [ 672.876728] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 672.884661] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431512, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 672.939278] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Updating instance_info_cache with network_info: [{"id": "4095f464-bbab-4645-90f0-c0ae53be0abb", "address": "fa:16:3e:ad:fe:61", "network": {"id": "e47b8814-f451-4dc8-ad9e-ee3ee5aee474", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-470765249-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c1575f9a28d41c4bcec3a22545a5d37", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "41278529-8bd2-44a1-97c8-03967faa3ff7", "external-id": "nsx-vlan-transportzone-749", "segmentation_id": 749, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4095f464-bb", "ovs_interfaceid": "4095f464-bbab-4645-90f0-c0ae53be0abb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.953121] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Releasing lock "refresh_cache-81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 672.953460] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 
tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance network_info: |[{"id": "4095f464-bbab-4645-90f0-c0ae53be0abb", "address": "fa:16:3e:ad:fe:61", "network": {"id": "e47b8814-f451-4dc8-ad9e-ee3ee5aee474", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-470765249-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c1575f9a28d41c4bcec3a22545a5d37", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "41278529-8bd2-44a1-97c8-03967faa3ff7", "external-id": "nsx-vlan-transportzone-749", "segmentation_id": 749, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4095f464-bb", "ovs_interfaceid": "4095f464-bbab-4645-90f0-c0ae53be0abb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 672.953945] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:fe:61', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '41278529-8bd2-44a1-97c8-03967faa3ff7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4095f464-bbab-4645-90f0-c0ae53be0abb', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 672.961483] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Creating folder: Project (4c1575f9a28d41c4bcec3a22545a5d37). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 672.962152] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-baf279d9-3ae0-4c24-aacd-3b55633d2848 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.973272] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Created folder: Project (4c1575f9a28d41c4bcec3a22545a5d37) in parent group-v684465. [ 672.973272] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Creating folder: Instances. Parent ref: group-v684487. 
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 672.973437] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8895c38e-41a5-492a-98e8-cf6cfef46165 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.983661] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Created folder: Instances in parent group-v684487. [ 672.983661] env[68964]: DEBUG oslo.service.loopingcall [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 672.983847] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 672.984345] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7c8bd005-e42c-49bb-bb4d-8a693588a094 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.006994] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 673.006994] env[68964]: value = "task-3431515" [ 673.006994] env[68964]: _type = "Task" [ 673.006994] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 673.016116] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431515, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 673.080676] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 673.081211] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 673.095452] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 673.178418] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 673.178676] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 673.180616] env[68964]: INFO nova.compute.claims [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 673.390721] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431512, 'name': CreateVM_Task, 'duration_secs': 0.303918} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 673.390721] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 673.390721] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 673.390721] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 673.390721] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 673.391079] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-12d35533-14f7-48fa-aff9-f62a1749c6b9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.398831] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for the task: (returnval){ [ 673.398831] env[68964]: value = 
"session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52264105-ca96-ee02-d3f6-33338db3f992" [ 673.398831] env[68964]: _type = "Task" [ 673.398831] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 673.407976] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52264105-ca96-ee02-d3f6-33338db3f992, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 673.477522] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80853bf1-92a2-4fc6-8c7b-4aca7f8dc807 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.485883] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-153cc257-8f9e-4820-8aa5-8296747b436e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.526292] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77995fdc-990c-418c-9bc9-8a9170436a4b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.536778] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431515, 'name': CreateVM_Task, 'duration_secs': 0.347519} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 673.536971] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 673.538607] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b65bf61-14aa-4891-954b-bbaf9882d49e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.545086] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 673.564913] env[68964]: DEBUG nova.compute.provider_tree [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 673.581937] env[68964]: DEBUG nova.scheduler.client.report [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 673.607842] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.428s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.607842] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 673.673016] env[68964]: DEBUG nova.compute.utils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 673.678205] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 673.678477] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 673.690916] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 673.799811] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 673.838649] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 673.838878] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 673.842754] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 673.842754] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 673.842917] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 673.843743] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 673.843743] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 673.843743] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 
tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 673.843743] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 673.843925] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 673.843925] env[68964]: DEBUG nova.virt.hardware [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 673.844836] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc74ad0e-a24a-4a90-bdee-d7feda51bb62 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.857959] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9dbab36-a252-480d-86bc-126f0b160403 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.908785] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 673.909065] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 673.909290] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 673.909505] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 673.910220] env[68964]: DEBUG oslo_concurrency.lockutils 
[None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 673.910513] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-67a8bb4e-fdbe-4218-810f-ad5fa4472795 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.915335] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Waiting for the task: (returnval){ [ 673.915335] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5237f33b-43ca-5a2c-156c-446fb9f8431d" [ 673.915335] env[68964]: _type = "Task" [ 673.915335] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 673.926029] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5237f33b-43ca-5a2c-156c-446fb9f8431d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 673.946258] env[68964]: DEBUG nova.policy [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f436a0fbfeed480c98cc4e8bfca197bf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a7897c354fc54193b870769dbab19c5f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 674.430993] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 674.430993] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 674.431502] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 675.337081] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Successfully created port: d1b6bc11-ca10-46a0-9ea5-f537a37e51bb {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 675.608752] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Successfully updated port: 481d0494-b598-4dfd-aa1d-c0e941fe26ae {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 675.626220] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "refresh_cache-5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 675.626360] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquired lock "refresh_cache-5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 675.626512] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 676.008184] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 676.506630] env[68964]: DEBUG nova.compute.manager [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Received event network-changed-9c359d6f-bf7c-4806-8f54-9f44918cd679 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 676.507256] env[68964]: DEBUG nova.compute.manager [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Refreshing instance network info cache due to event network-changed-9c359d6f-bf7c-4806-8f54-9f44918cd679. 
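[editor's note] The "Policy check for network:attach_external_network failed" DEBUG entry a few lines up is oslo.policy's enforce() returning False for a non-admin token. A hedged sketch of that check; the "role:admin" default here is an illustration only (Nova registers its real rule defaults under nova/policies/):

```python
from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
enforcer.register_default(
    policy.RuleDefault("network:attach_external_network", "role:admin"))

# Credentials mirror the ones dumped in the log: reader/member, no admin.
creds = {"roles": ["reader", "member"],
         "project_id": "a7897c354fc54193b870769dbab19c5f"}

# do_raise defaults to False, so a failed check just returns False and
# the request proceeds without external networks.
print(enforcer.enforce("network:attach_external_network", {}, creds))
```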
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 676.509398] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Acquiring lock "refresh_cache-0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 676.509504] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Acquired lock "refresh_cache-0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 676.509635] env[68964]: DEBUG nova.network.neutron [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Refreshing network info cache for port 9c359d6f-bf7c-4806-8f54-9f44918cd679 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 676.788323] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Updating instance_info_cache with network_info: [{"id": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "address": "fa:16:3e:d6:9b:a9", "network": {"id": "d09d3627-423b-4a44-96c7-1d56d49d8b4c", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-107943923-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9318cf282d8b4a83a87c0c481f2d1011", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap481d0494-b5", "ovs_interfaceid": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 676.801762] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Releasing lock "refresh_cache-5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 676.802329] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance network_info: |[{"id": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "address": "fa:16:3e:d6:9b:a9", "network": {"id": "d09d3627-423b-4a44-96c7-1d56d49d8b4c", "bridge": "br-int", 
"label": "tempest-ImagesOneServerNegativeTestJSON-107943923-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9318cf282d8b4a83a87c0c481f2d1011", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap481d0494-b5", "ovs_interfaceid": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 676.803290] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d6:9b:a9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '481d0494-b598-4dfd-aa1d-c0e941fe26ae', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 676.811727] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Creating folder: Project (9318cf282d8b4a83a87c0c481f2d1011). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 676.812401] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e6957681-74c5-4b76-a700-67c0125c05ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.823925] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Created folder: Project (9318cf282d8b4a83a87c0c481f2d1011) in parent group-v684465. [ 676.824040] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Creating folder: Instances. Parent ref: group-v684490. 
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 676.824331] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bd11ed08-7216-41c6-bb14-e5bb22a82701 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.832967] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Created folder: Instances in parent group-v684490. [ 676.833310] env[68964]: DEBUG oslo.service.loopingcall [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 676.833548] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 676.833864] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c75b8bbc-9e8d-42d5-aac9-649fee62d0d2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 676.855489] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 676.855489] env[68964]: value = "task-3431518" [ 676.855489] env[68964]: _type = "Task" [ 676.855489] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 676.864683] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431518, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 677.370316] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431518, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 677.792636] env[68964]: DEBUG nova.network.neutron [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Updated VIF entry in instance network info cache for port 9c359d6f-bf7c-4806-8f54-9f44918cd679. 
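[editor's note] "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" comes from oslo.service's looping-call machinery. A self-contained model of how a looping call blocks until the wrapped function signals completion (not the exact wrapper Nova uses, which layers retry handling on top):

```python
from oslo_service import loopingcall

attempts = []

def create_vm():
    # Pretend the first two polls find the task still running.
    attempts.append(1)
    if len(attempts) < 3:
        return
    raise loopingcall.LoopingCallDone(retvalue="vm-123")

timer = loopingcall.FixedIntervalLoopingCall(create_vm)
result = timer.start(interval=0.1).wait()  # blocks until LoopingCallDone
print(result)  # vm-123
```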
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 677.793291] env[68964]: DEBUG nova.network.neutron [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Updating instance_info_cache with network_info: [{"id": "9c359d6f-bf7c-4806-8f54-9f44918cd679", "address": "fa:16:3e:e6:24:06", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c359d6f-bf", "ovs_interfaceid": "9c359d6f-bf7c-4806-8f54-9f44918cd679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 677.810604] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Releasing lock "refresh_cache-0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 677.810728] env[68964]: DEBUG nova.compute.manager [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Received event network-vif-plugged-4095f464-bbab-4645-90f0-c0ae53be0abb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 677.810882] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Acquiring lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 677.811180] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 677.811370] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 677.811538] env[68964]: DEBUG nova.compute.manager 
[req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] No waiting events found dispatching network-vif-plugged-4095f464-bbab-4645-90f0-c0ae53be0abb {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 677.811703] env[68964]: WARNING nova.compute.manager [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Received unexpected event network-vif-plugged-4095f464-bbab-4645-90f0-c0ae53be0abb for instance with vm_state building and task_state spawning. [ 677.811860] env[68964]: DEBUG nova.compute.manager [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Received event network-changed-4095f464-bbab-4645-90f0-c0ae53be0abb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 677.812017] env[68964]: DEBUG nova.compute.manager [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Refreshing instance network info cache due to event network-changed-4095f464-bbab-4645-90f0-c0ae53be0abb. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 677.812296] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Acquiring lock "refresh_cache-81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 677.812330] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Acquired lock "refresh_cache-81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 677.812503] env[68964]: DEBUG nova.network.neutron [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Refreshing network info cache for port 4095f464-bbab-4645-90f0-c0ae53be0abb {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 677.870552] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431518, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 678.366810] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431518, 'name': CreateVM_Task, 'duration_secs': 1.304199} completed successfully. 
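[editor's note] The task-3431518 lifecycle just above (progress 0% -> 99% -> completed with duration_secs) is oslo.vmware's wait_for_task() polling loop around a vCenter task. A sketch continuing the session from the folder example; `vm_folder` and `res_pool` are assumed to be managed-object refs already obtained via PropertyCollector lookups like the ones throughout this log, and the config spec is deliberately minimal:

```python
# Requires a live vCenter plus `session`, `vm_folder`, `res_pool` from
# earlier lookups -- a sketch, not Nova's build_virtual_machine().
factory = session.vim.client.factory
spec = factory.create("ns0:VirtualMachineConfigSpec")
spec.name = "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa"
spec.guestId = "otherGuest"
spec.files = factory.create("ns0:VirtualMachineFileInfo")
spec.files.vmPathName = "[datastore2]"

task = session.invoke_api(session.vim, "CreateVM_Task", vm_folder,
                          config=spec, pool=res_pool, host=None)
task_info = session.wait_for_task(task)  # the 0% -> 99% -> done polling
vm_ref = task_info.result                # the new VM's managed object ref
```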
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 678.366998] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 678.367700] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 678.367862] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 678.368194] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 678.368510] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cdc3935b-5ef7-418d-ad96-b3a8ad65e09b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 678.373059] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Waiting for the task: (returnval){ [ 678.373059] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a00792-6184-2f9b-3826-0f8454b9d274" [ 678.373059] env[68964]: _type = "Task" [ 678.373059] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 678.380974] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a00792-6184-2f9b-3826-0f8454b9d274, 'name': SearchDatastore_Task} progress is 0%. 
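[editor's note] The image-cache choreography around "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" (acquire the lock and external semaphore, SearchDatastore_Task, release) is a double-checked cache fill. A compressed sketch of the pattern; passing external=True (with a configured lock_path) would extend it across processes, which is the "external semaphore" wording in the log:

```python
from oslo_concurrency import lockutils

CACHE_KEY = ("[datastore2] devstack-image-cache_base/"
             "b0d1c28b-5c3d-4c47-808f-66751157cde6")

def ensure_image_cached(image_present, fetch_image):
    # The first builder downloads the VMDK; everyone queued behind the
    # lock then finds it present (the SearchDatastore_Task step above).
    with lockutils.lock(CACHE_KEY):
        if not image_present():
            fetch_image()

ensure_image_cached(lambda: False,
                    lambda: print("downloading image from Glance"))
```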
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 678.558331] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Successfully updated port: d1b6bc11-ca10-46a0-9ea5-f537a37e51bb {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 678.572731] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "refresh_cache-e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 678.572900] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquired lock "refresh_cache-e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 678.573046] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 678.709822] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 678.884800] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 678.885459] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 678.885736] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 679.077730] env[68964]: DEBUG nova.network.neutron [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Updated VIF entry in instance network info cache for port 4095f464-bbab-4645-90f0-c0ae53be0abb. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 679.078123] env[68964]: DEBUG nova.network.neutron [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Updating instance_info_cache with network_info: [{"id": "4095f464-bbab-4645-90f0-c0ae53be0abb", "address": "fa:16:3e:ad:fe:61", "network": {"id": "e47b8814-f451-4dc8-ad9e-ee3ee5aee474", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-470765249-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c1575f9a28d41c4bcec3a22545a5d37", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "41278529-8bd2-44a1-97c8-03967faa3ff7", "external-id": "nsx-vlan-transportzone-749", "segmentation_id": 749, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4095f464-bb", "ovs_interfaceid": "4095f464-bbab-4645-90f0-c0ae53be0abb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 679.095909] env[68964]: DEBUG oslo_concurrency.lockutils [req-a62136a8-b809-4620-8b35-279facce89bd req-79eed3ac-9bc5-4c3d-88dd-605ecacd9e64 service nova] Releasing lock "refresh_cache-81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" {{(pid=68964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 679.384651] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Updating instance_info_cache with network_info: [{"id": "d1b6bc11-ca10-46a0-9ea5-f537a37e51bb", "address": "fa:16:3e:1a:5b:a0", "network": {"id": "2b29d8e5-b009-4503-844e-313b88618fa0", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2097658976-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a7897c354fc54193b870769dbab19c5f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0636c3f6-fcb7-4954-ab07-c5cd0dee37b0", "external-id": "nsx-vlan-transportzone-857", "segmentation_id": 857, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1b6bc11-ca", "ovs_interfaceid": "d1b6bc11-ca10-46a0-9ea5-f537a37e51bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 679.406498] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Releasing lock "refresh_cache-e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 679.406818] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance network_info: |[{"id": "d1b6bc11-ca10-46a0-9ea5-f537a37e51bb", "address": "fa:16:3e:1a:5b:a0", "network": {"id": "2b29d8e5-b009-4503-844e-313b88618fa0", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2097658976-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a7897c354fc54193b870769dbab19c5f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0636c3f6-fcb7-4954-ab07-c5cd0dee37b0", "external-id": "nsx-vlan-transportzone-857", "segmentation_id": 857, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1b6bc11-ca", "ovs_interfaceid": "d1b6bc11-ca10-46a0-9ea5-f537a37e51bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 679.407197] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1a:5b:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0636c3f6-fcb7-4954-ab07-c5cd0dee37b0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd1b6bc11-ca10-46a0-9ea5-f537a37e51bb', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 679.415339] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Creating folder: Project (a7897c354fc54193b870769dbab19c5f). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 679.416257] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4a7b2c1d-1d7c-494e-ab53-9f5153137e36 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.431916] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Created folder: Project (a7897c354fc54193b870769dbab19c5f) in parent group-v684465. [ 679.432135] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Creating folder: Instances. Parent ref: group-v684493. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 679.432380] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5fc65f9e-71e9-447d-a9ac-c1d9f5086feb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.443879] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Created folder: Instances in parent group-v684493. [ 679.443952] env[68964]: DEBUG oslo.service.loopingcall [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 679.444550] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 679.444550] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c6f41ab9-57e8-4d0f-9841-a1ea83bab54a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.468573] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 679.468573] env[68964]: value = "task-3431521" [ 679.468573] env[68964]: _type = "Task" [ 679.468573] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 679.482912] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431521, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 679.989256] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431521, 'name': CreateVM_Task, 'duration_secs': 0.372924} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 679.989256] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 679.989256] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 679.989256] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 679.989256] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 679.990545] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a2d1ef1f-7061-4240-b842-66bf63d160aa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 679.998523] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Waiting for the task: (returnval){ [ 679.998523] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52fe8124-b52e-805e-6d49-43c8407a41f2" [ 
679.998523] env[68964]: _type = "Task" [ 679.998523] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 680.009686] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52fe8124-b52e-805e-6d49-43c8407a41f2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 680.511650] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 680.511965] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 680.512134] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 680.784411] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquiring lock "25b453f5-0b24-4c97-9a2d-6466e1489d07" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 680.784671] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Lock "25b453f5-0b24-4c97-9a2d-6466e1489d07" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.131445] env[68964]: DEBUG nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Received event network-vif-plugged-481d0494-b598-4dfd-aa1d-c0e941fe26ae {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 682.131695] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Acquiring lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 682.131852] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.132021] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 682.132373] env[68964]: DEBUG nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] No waiting events found dispatching network-vif-plugged-481d0494-b598-4dfd-aa1d-c0e941fe26ae {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 682.132545] env[68964]: WARNING nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Received unexpected event network-vif-plugged-481d0494-b598-4dfd-aa1d-c0e941fe26ae for instance with vm_state building and task_state spawning. [ 682.132707] env[68964]: DEBUG nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Received event network-changed-481d0494-b598-4dfd-aa1d-c0e941fe26ae {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 682.132871] env[68964]: DEBUG nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Refreshing instance network info cache due to event network-changed-481d0494-b598-4dfd-aa1d-c0e941fe26ae. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 682.133144] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Acquiring lock "refresh_cache-5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 682.134199] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Acquired lock "refresh_cache-5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 682.135058] env[68964]: DEBUG nova.network.neutron [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Refreshing network info cache for port 481d0494-b598-4dfd-aa1d-c0e941fe26ae {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 682.152383] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "ff09bdbb-84e3-4182-8118-e99512a0e9de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 682.153077] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.466736] env[68964]: DEBUG nova.network.neutron [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Updated VIF entry in instance network info cache for port 481d0494-b598-4dfd-aa1d-c0e941fe26ae. 
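[editor's note] "Updated VIF entry in instance network info cache for port 481d0494-b598-4dfd-aa1d-c0e941fe26ae" is narrower than a full refresh: only the cached element whose id matches the changed port is replaced. A one-function sketch of that in-place update:

```python
def update_vif_entry(nw_info, port_id, fresh_vif):
    # Replace the matching element; leave every other VIF untouched.
    return [fresh_vif if vif["id"] == port_id else vif for vif in nw_info]

cache = [{"id": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "active": False}]
cache = update_vif_entry(cache,
                         "481d0494-b598-4dfd-aa1d-c0e941fe26ae",
                         {"id": "481d0494-b598-4dfd-aa1d-c0e941fe26ae",
                          "active": True})
print(cache)
```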
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 682.467325] env[68964]: DEBUG nova.network.neutron [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Updating instance_info_cache with network_info: [{"id": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "address": "fa:16:3e:d6:9b:a9", "network": {"id": "d09d3627-423b-4a44-96c7-1d56d49d8b4c", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-107943923-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9318cf282d8b4a83a87c0c481f2d1011", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap481d0494-b5", "ovs_interfaceid": "481d0494-b598-4dfd-aa1d-c0e941fe26ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 682.482740] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Releasing lock "refresh_cache-5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 682.483032] env[68964]: DEBUG nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Received event network-vif-plugged-d1b6bc11-ca10-46a0-9ea5-f537a37e51bb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 682.483249] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Acquiring lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 682.483458] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 682.483613] env[68964]: DEBUG oslo_concurrency.lockutils [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 682.483770] env[68964]: DEBUG 
nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] No waiting events found dispatching network-vif-plugged-d1b6bc11-ca10-46a0-9ea5-f537a37e51bb {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 682.483927] env[68964]: WARNING nova.compute.manager [req-76cf7c16-71e2-4cf8-a244-7ca4a5a63587 req-b46a677b-8cb0-459e-a8f9-5168de49e0d2 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Received unexpected event network-vif-plugged-d1b6bc11-ca10-46a0-9ea5-f537a37e51bb for instance with vm_state building and task_state spawning. [ 683.377188] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "f86d97e4-42f3-464b-9d7b-7c05f19290ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 683.378145] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "f86d97e4-42f3-464b-9d7b-7c05f19290ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 683.807536] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquiring lock "5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 683.807831] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Lock "5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 684.554718] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 684.555143] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 685.033847] env[68964]: DEBUG 
oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "323acb55-859a-4545-a046-1934cf98be6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 685.034092] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "323acb55-859a-4545-a046-1934cf98be6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 685.734030] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 685.734315] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 686.671211] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d85f4462-2b04-4df3-b5a8-2e69ddc49452 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "fa867511-44ae-47e6-8c05-5f2abf8eae88" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 686.671435] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d85f4462-2b04-4df3-b5a8-2e69ddc49452 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "fa867511-44ae-47e6-8c05-5f2abf8eae88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 686.997036] env[68964]: DEBUG nova.compute.manager [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Received event network-changed-d1b6bc11-ca10-46a0-9ea5-f537a37e51bb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 686.997036] env[68964]: DEBUG nova.compute.manager [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Refreshing instance network info cache due to event network-changed-d1b6bc11-ca10-46a0-9ea5-f537a37e51bb. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 686.997036] env[68964]: DEBUG oslo_concurrency.lockutils [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] Acquiring lock "refresh_cache-e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 686.997036] env[68964]: DEBUG oslo_concurrency.lockutils [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] Acquired lock "refresh_cache-e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 686.997036] env[68964]: DEBUG nova.network.neutron [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Refreshing network info cache for port d1b6bc11-ca10-46a0-9ea5-f537a37e51bb {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 687.523077] env[68964]: DEBUG nova.network.neutron [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Updated VIF entry in instance network info cache for port d1b6bc11-ca10-46a0-9ea5-f537a37e51bb. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 687.523399] env[68964]: DEBUG nova.network.neutron [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Updating instance_info_cache with network_info: [{"id": "d1b6bc11-ca10-46a0-9ea5-f537a37e51bb", "address": "fa:16:3e:1a:5b:a0", "network": {"id": "2b29d8e5-b009-4503-844e-313b88618fa0", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2097658976-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a7897c354fc54193b870769dbab19c5f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0636c3f6-fcb7-4954-ab07-c5cd0dee37b0", "external-id": "nsx-vlan-transportzone-857", "segmentation_id": 857, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1b6bc11-ca", "ovs_interfaceid": "d1b6bc11-ca10-46a0-9ea5-f537a37e51bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.540223] env[68964]: DEBUG oslo_concurrency.lockutils [req-dde9551d-dcf2-4dc9-8f24-c2e2b599c774 req-26884a83-5d22-41e9-baea-f5c1c5c736f6 service nova] Releasing lock "refresh_cache-e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 688.714433] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77fc965e-a8ce-4e2a-be06-01367174d032 tempest-InstanceActionsV221TestJSON-849543650 
tempest-InstanceActionsV221TestJSON-849543650-project-member] Acquiring lock "888fb47f-5f48-415c-9289-61b9c42523e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 688.715931] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77fc965e-a8ce-4e2a-be06-01367174d032 tempest-InstanceActionsV221TestJSON-849543650 tempest-InstanceActionsV221TestJSON-849543650-project-member] Lock "888fb47f-5f48-415c-9289-61b9c42523e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 689.512772] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77e3bc7b-6647-4893-85cf-45f775e04cfc tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "c3ad57a5-1ea2-484a-b014-6276e0ee7914" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 689.512896] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77e3bc7b-6647-4893-85cf-45f775e04cfc tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "c3ad57a5-1ea2-484a-b014-6276e0ee7914" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 699.880499] env[68964]: WARNING oslo_vmware.rw_handles [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 699.880499] env[68964]: ERROR oslo_vmware.rw_handles
[ 699.880499] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 699.881904] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 699.884450] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Copying Virtual Disk [datastore2] vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/361b6bae-a7ff-4e0f-a072-aa4e39f5931f/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 699.886016] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cc3524b3-570a-478a-808d-a1c5c4282be9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 699.893350] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Waiting for the task: (returnval){
[ 699.893350] env[68964]: value = "task-3431522"
[ 699.893350] env[68964]: _type = "Task"
[ 699.893350] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 699.901922] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Task: {'id': task-3431522, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 700.408158] env[68964]: DEBUG oslo_vmware.exceptions [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 700.408158] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 700.410733] env[68964]: ERROR nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 700.410733] env[68964]: Faults: ['InvalidArgument']
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Traceback (most recent call last):
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] yield resources
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self.driver.spawn(context, instance, image_meta,
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self._fetch_image_if_missing(context, vi)
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] image_cache(vi, tmp_image_ds_loc)
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] vm_util.copy_virtual_disk(
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] session._wait_for_task(vmdk_copy_task)
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] return self.wait_for_task(task_ref)
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] return evt.wait()
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] result = hub.switch()
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] return self.greenlet.switch()
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self.f(*self.args, **self.kw)
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] raise exceptions.translate_fault(task_info.error)
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Faults: ['InvalidArgument']
[ 700.410733] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400]
[ 700.411759] env[68964]: INFO nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Terminating instance
[ 700.412647] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 700.412911] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 700.413425] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "refresh_cache-2857727a-d410-4021-ae38-bc2bf6aac400" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 700.413604] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquired lock "refresh_cache-2857727a-d410-4021-ae38-bc2bf6aac400" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 700.413774] env[68964]: DEBUG nova.network.neutron [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 700.414767] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d83d93bc-011b-4dd5-a4f2-854e235a724b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 700.427757] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 700.427757] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 700.427757] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0acab24-e3f4-4755-9273-8281fac601e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 700.434594] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Waiting for the task: (returnval){
[ 700.434594] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a6539a-d16c-1141-608a-bccf29ffef26"
[ 700.434594] env[68964]: _type = "Task"
[ 700.434594] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 700.442647] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a6539a-d16c-1141-608a-bccf29ffef26, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 700.479114] env[68964]: DEBUG nova.network.neutron [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 700.878390] env[68964]: DEBUG nova.network.neutron [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 700.893515] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Releasing lock "refresh_cache-2857727a-d410-4021-ae38-bc2bf6aac400" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 700.893515] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 700.893515] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 700.894569] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ba8796-93d1-42e5-813d-1691225b79e7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.904218] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 700.904380] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-23b6f616-4415-441f-9441-e9601fe07bf7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.951534] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 700.951534] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Creating directory with path [datastore2] 
vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 700.951534] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 700.951534] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 700.951919] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Deleting the datastore file [datastore2] 2857727a-d410-4021-ae38-bc2bf6aac400 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 700.952652] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1d4e3780-eab4-48c7-a68e-459243926e00 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 700.954440] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a88a4375-bb3c-40fa-b2f6-6acdb270052a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 700.962068] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Waiting for the task: (returnval){
[ 700.962068] env[68964]: value = "task-3431524"
[ 700.962068] env[68964]: _type = "Task"
[ 700.962068] env[68964]: } to complete.
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 700.967705] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Created directory with path [datastore2] vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 700.967705] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Fetch image to [datastore2] vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 700.967705] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 700.968147] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe480206-76ad-431c-9d67-873d8317e685 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.974609] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Task: {'id': task-3431524, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 700.979360] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67f72a36-c3f5-45b2-be21-40a7a6db5c5d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 700.989234] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a92dfc88-44cd-4cc4-bfba-f07539982e73 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.025655] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-946bcbfa-c8fc-400b-93f9-96a007afd749 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.032254] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-043de433-f4b4-4875-9f6a-32f579ef661d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.056096] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 701.127667] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 701.199270] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 701.199270] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 701.475869] env[68964]: DEBUG oslo_vmware.api [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Task: {'id': task-3431524, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.040302} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 701.477035] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 701.477035] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 701.477035] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 701.477035] env[68964]: INFO nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Took 0.58 seconds to destroy the instance on the hypervisor. [ 701.477634] env[68964]: DEBUG oslo.service.loopingcall [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 701.477894] env[68964]: DEBUG nova.compute.manager [-] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 701.480334] env[68964]: DEBUG nova.compute.claims [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 701.480418] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.480840] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.830163] env[68964]: DEBUG oslo_concurrency.lockutils [None req-feef32b4-f1f8-4a9f-8191-d9b30f2f25a6 tempest-ServerExternalEventsTest-282581830 tempest-ServerExternalEventsTest-282581830-project-member] Acquiring lock "7dbca935-17b3-4a4b-ae3e-558bc802f9b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.830446] env[68964]: DEBUG oslo_concurrency.lockutils [None req-feef32b4-f1f8-4a9f-8191-d9b30f2f25a6 tempest-ServerExternalEventsTest-282581830 tempest-ServerExternalEventsTest-282581830-project-member] Lock "7dbca935-17b3-4a4b-ae3e-558bc802f9b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.900146] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-267f6de9-d18a-48e7-89fb-c7df80a123da {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.910109] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1383f14-e483-4fdb-9198-7cc79e817b5b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.946216] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae185db7-adfd-47ca-a135-506ed487156c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.954067] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f7b15ea-f1f8-49ca-b0d0-fef8e389c191 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 701.969529] env[68964]: DEBUG nova.compute.provider_tree [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Inventory has not 
changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 701.978844] env[68964]: DEBUG nova.scheduler.client.report [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 701.997980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.517s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 701.998591] env[68964]: ERROR nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 701.998591] env[68964]: Faults: ['InvalidArgument']
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Traceback (most recent call last):
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self.driver.spawn(context, instance, image_meta,
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self._fetch_image_if_missing(context, vi)
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] image_cache(vi, tmp_image_ds_loc)
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] vm_util.copy_virtual_disk(
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] session._wait_for_task(vmdk_copy_task)
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] return self.wait_for_task(task_ref)
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] return evt.wait()
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] result = hub.switch()
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] return self.greenlet.switch()
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] self.f(*self.args, **self.kw)
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] raise exceptions.translate_fault(task_info.error)
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Faults: ['InvalidArgument']
[ 701.998591] env[68964]: ERROR nova.compute.manager [instance: 2857727a-d410-4021-ae38-bc2bf6aac400]
[ 702.000861] env[68964]: DEBUG nova.compute.utils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 702.004548] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Build of instance 2857727a-d410-4021-ae38-bc2bf6aac400 was re-scheduled: A specified parameter was not correct: fileType
[ 702.004548] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 702.006300] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 702.006300] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquiring lock "refresh_cache-2857727a-d410-4021-ae38-bc2bf6aac400" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 702.006300] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Acquired lock "refresh_cache-2857727a-d410-4021-ae38-bc2bf6aac400" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 702.006300] env[68964]: DEBUG nova.network.neutron [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 702.041255] env[68964]: DEBUG nova.network.neutron [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 702.235061] env[68964]: DEBUG nova.network.neutron [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 702.248350] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Releasing lock "refresh_cache-2857727a-d410-4021-ae38-bc2bf6aac400" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 702.248740] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 702.248875] env[68964]: DEBUG nova.compute.manager [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] [instance: 2857727a-d410-4021-ae38-bc2bf6aac400] Skipping network deallocation for instance since networking was not requested.
{{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 702.399947] env[68964]: INFO nova.scheduler.client.report [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Deleted allocations for instance 2857727a-d410-4021-ae38-bc2bf6aac400 [ 702.433501] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ae0c69b-2e7c-4fb0-8113-c8550fc6aae1 tempest-ServerDiagnosticsV248Test-1381931689 tempest-ServerDiagnosticsV248Test-1381931689-project-member] Lock "2857727a-d410-4021-ae38-bc2bf6aac400" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 51.385s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 702.471028] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 702.556559] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 702.556717] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 702.558236] env[68964]: INFO nova.compute.claims [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 702.971053] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae5b54e5-ef6d-4b3c-8f3c-f99eb99c0d30 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.980749] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97399f63-f951-4ca1-ae62-99964a1556e0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.023803] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1df4086f-81cb-4daa-a177-b958058e789a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.028295] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7da78f89-23fc-4ec5-b971-779a66dba83b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.042170] 
env[68964]: DEBUG nova.compute.provider_tree [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 703.051548] env[68964]: DEBUG nova.scheduler.client.report [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 703.069120] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.512s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 703.069658] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 703.118345] env[68964]: DEBUG nova.compute.utils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 703.120181] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 703.120280] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 703.141306] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Start building block device mappings for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 703.241822] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 703.264257] env[68964]: DEBUG nova.policy [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7daa97ee8a74ad6a11fa38385f6513e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '92573f717a0244ff864588cec3dffa1e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 703.272696] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:05:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2104627183',id=23,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1043957195',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 703.272919] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 703.273082] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 703.273261] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 703.273399] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 
tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 703.273536] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 703.273737] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 703.273889] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 703.274058] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 703.274217] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 703.274378] env[68964]: DEBUG nova.virt.hardware [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 703.276296] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b375c11-f4eb-4e00-b0ca-32858d6fe59a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.285202] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b170d0f-4ebf-4683-b75c-323d6c587df2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.498393] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Successfully created port: 62c236e1-40d6-4bb2-8293-c131f7892f3e {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 707.189251] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 
tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Successfully updated port: 62c236e1-40d6-4bb2-8293-c131f7892f3e {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 707.208980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquiring lock "refresh_cache-25b453f5-0b24-4c97-9a2d-6466e1489d07" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 707.209205] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquired lock "refresh_cache-25b453f5-0b24-4c97-9a2d-6466e1489d07" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 707.209303] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 707.306304] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.770579] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Updating instance_info_cache with network_info: [{"id": "62c236e1-40d6-4bb2-8293-c131f7892f3e", "address": "fa:16:3e:63:58:f7", "network": {"id": "e797e2a4-d152-43bc-add2-0c82c2190504", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1892593703-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92573f717a0244ff864588cec3dffa1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd77ecbc-aaaf-45f4-ae8f-977d90e4052f", "external-id": "nsx-vlan-transportzone-171", "segmentation_id": 171, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62c236e1-40", "ovs_interfaceid": "62c236e1-40d6-4bb2-8293-c131f7892f3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.783812] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 
tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Releasing lock "refresh_cache-25b453f5-0b24-4c97-9a2d-6466e1489d07" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 707.783937] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Instance network_info: |[{"id": "62c236e1-40d6-4bb2-8293-c131f7892f3e", "address": "fa:16:3e:63:58:f7", "network": {"id": "e797e2a4-d152-43bc-add2-0c82c2190504", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1892593703-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92573f717a0244ff864588cec3dffa1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd77ecbc-aaaf-45f4-ae8f-977d90e4052f", "external-id": "nsx-vlan-transportzone-171", "segmentation_id": 171, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62c236e1-40", "ovs_interfaceid": "62c236e1-40d6-4bb2-8293-c131f7892f3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 707.784293] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:63:58:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'fd77ecbc-aaaf-45f4-ae8f-977d90e4052f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '62c236e1-40d6-4bb2-8293-c131f7892f3e', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 707.793477] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Creating folder: Project (92573f717a0244ff864588cec3dffa1e). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.794108] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa9ba00c-6702-446e-8493-5dc3dc13f62c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.805895] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Created folder: Project (92573f717a0244ff864588cec3dffa1e) in parent group-v684465. 
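
The Folder.CreateFolder calls above behave idempotently in practice: a DuplicateName fault means a concurrent request already created the folder and the existing one is reused. A hedged sketch of that pattern with oslo.vmware, assuming `session` is an established oslo_vmware.api.VMwareAPISession (the real helper lives in nova/virt/vmwareapi/vm_util.py; lookup of the existing folder is omitted):

    from oslo_vmware import exceptions as vexc

    def ensure_folder(session, parent_ref, name):
        try:
            # Invokes Folder.CreateFolder on the parent managed object and
            # returns the new folder reference.
            return session.invoke_api(session.vim, 'CreateFolder',
                                      parent_ref, name=name)
        except vexc.DuplicateName:
            # Lost the creation race: treat as success and fall back to
            # looking up the existing child folder (omitted here).
            return None
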
[ 707.805895] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Creating folder: Instances. Parent ref: group-v684499. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.805895] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-25d302e7-2b74-4a23-837b-46013ca9d82e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.817953] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Created folder: Instances in parent group-v684499. [ 707.817953] env[68964]: DEBUG oslo.service.loopingcall [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 707.817953] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 707.818285] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1bd772c0-771a-43b1-b6a4-20cc6c55a42e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.842259] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 707.842259] env[68964]: value = "task-3431532" [ 707.842259] env[68964]: _type = "Task" [ 707.842259] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 707.850836] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431532, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.979432] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 707.979432] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 708.353578] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431532, 'name': CreateVM_Task, 'duration_secs': 0.372079} completed successfully. 
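
The task-3431532 lines above come from oslo.vmware's task polling; in the library, session.wait_for_task() drives this via a looping call. A simplified stand-in (illustrative, not the library implementation) showing the shape of the loop:

    import time
    from oslo_vmware import vim_util

    def wait_for_task(session, task_ref, interval=0.5):
        while True:
            # Read the Task managed object's 'info' property each cycle.
            info = session.invoke_api(vim_util, 'get_object_property',
                                      session.vim, task_ref, 'info')
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                raise RuntimeError(info.error.localizedMessage)
            # 'queued'/'running': this is where the "progress is N%" DEBUG
            # lines above get emitted before the next poll.
            time.sleep(interval)
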
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 708.353888] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 708.354890] env[68964]: DEBUG oslo_vmware.service [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-723ec724-4f0f-45c5-b68f-5586ec87a56b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.361918] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.362117] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 708.362484] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 708.362746] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a080977c-4fca-4361-aec6-2a9e04a60252 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.368264] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Waiting for the task: (returnval){ [ 708.368264] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529e0340-47b4-1e78-842b-fac82626d739" [ 708.368264] env[68964]: _type = "Task" [ 708.368264] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 708.376580] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529e0340-47b4-1e78-842b-fac82626d739, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 708.886414] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.886872] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 708.887251] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 708.888586] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 708.888586] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 708.888586] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bdc0c05d-9a0f-49da-8672-5706e10df2eb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.900025] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 708.900025] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Folder [datastore1] devstack-image-cache_base created. 
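
The "[datastore1] devstack-image-cache_base" strings in the mkdir lines above follow vSphere's "[datastore] relative/path" convention; a tiny hypothetical helper (not Nova's ds_util) showing just the formatting:

    def ds_path(datastore, *parts):
        # vSphere datastore paths look like: "[datastore1] dir/file.vmdk"
        return '[%s] %s' % (datastore, '/'.join(parts))

    print(ds_path('datastore1', 'devstack-image-cache_base',
                  'b0d1c28b-5c3d-4c47-808f-66751157cde6'))
    # [datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6
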
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 708.900546] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ad43152-dbbd-49ee-a83e-f2f548f9c7dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.908714] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49fda168-d797-4aa1-93de-079481b079cc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.916732] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Waiting for the task: (returnval){ [ 708.916732] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52571947-eefd-9a11-7e8a-4d5a1cdfb69d" [ 708.916732] env[68964]: _type = "Task" [ 708.916732] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 708.929762] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52571947-eefd-9a11-7e8a-4d5a1cdfb69d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 708.983173] env[68964]: DEBUG nova.compute.manager [req-6ec7f96b-38ea-41fc-8d3e-2e9a620f6b90 req-057d5572-c570-4453-a13f-0b7032144f00 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Received event network-vif-plugged-62c236e1-40d6-4bb2-8293-c131f7892f3e {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 708.983403] env[68964]: DEBUG oslo_concurrency.lockutils [req-6ec7f96b-38ea-41fc-8d3e-2e9a620f6b90 req-057d5572-c570-4453-a13f-0b7032144f00 service nova] Acquiring lock "25b453f5-0b24-4c97-9a2d-6466e1489d07-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 708.983809] env[68964]: DEBUG oslo_concurrency.lockutils [req-6ec7f96b-38ea-41fc-8d3e-2e9a620f6b90 req-057d5572-c570-4453-a13f-0b7032144f00 service nova] Lock "25b453f5-0b24-4c97-9a2d-6466e1489d07-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 708.983809] env[68964]: DEBUG oslo_concurrency.lockutils [req-6ec7f96b-38ea-41fc-8d3e-2e9a620f6b90 req-057d5572-c570-4453-a13f-0b7032144f00 service nova] Lock "25b453f5-0b24-4c97-9a2d-6466e1489d07-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 708.983955] env[68964]: DEBUG nova.compute.manager [req-6ec7f96b-38ea-41fc-8d3e-2e9a620f6b90 req-057d5572-c570-4453-a13f-0b7032144f00 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] No waiting events found dispatching network-vif-plugged-62c236e1-40d6-4bb2-8293-c131f7892f3e {{(pid=68964) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 708.985031] env[68964]: WARNING nova.compute.manager [req-6ec7f96b-38ea-41fc-8d3e-2e9a620f6b90 req-057d5572-c570-4453-a13f-0b7032144f00 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Received unexpected event network-vif-plugged-62c236e1-40d6-4bb2-8293-c131f7892f3e for instance with vm_state building and task_state spawning. [ 709.363126] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a957c0b7-5996-4d87-a63f-305c20176b3a tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "8370a744-2602-410e-a509-e8487810e266" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 709.363480] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a957c0b7-5996-4d87-a63f-305c20176b3a tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "8370a744-2602-410e-a509-e8487810e266" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 709.428779] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 709.428962] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Creating directory with path [datastore1] vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 709.429624] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-88cfdf88-46b9-4647-a030-f6f79fd7f7f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.452985] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Created directory with path [datastore1] vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 709.452985] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Fetch image to [datastore1] vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 709.452985] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 
25b453f5-0b24-4c97-9a2d-6466e1489d07] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 709.452985] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d185b1df-37c6-49b2-a53f-e53eab87ef03 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.465133] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f13ac86-46a7-46bd-ac00-1888eccab500 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.481166] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cac40daf-9491-4705-b9ae-7d14888034be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.520981] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8035ef7-f6f6-46ad-b2a5-b14d94218d74 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.528193] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d55ec721-b259-47b8-97bf-a05c20c0cc27 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.558089] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 709.628941] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 709.694713] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Completed reading data from the image iterator. 
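
The rw_handles lines around this point describe streaming the 21318656-byte sparse VMDK from the Glance image iterator to the datastore's HTTPS file endpoint (the .../folder/<path>?dcPath=...&dsName=... URL). A hedged sketch of the copy loop; oslo.vmware's rw_handles.FileWriteHandle is the real write side, simplified here:

    def stream_image(image_iter, write_handle):
        # Copy image chunks from the Glance iterator to the datastore
        # over the HTTPS file endpoint.
        transferred = 0
        for chunk in image_iter:
            write_handle.write(chunk)
            transferred += len(chunk)
        write_handle.close()   # produces the "Closing write handle" line below
        return transferred     # should equal the announced size (21318656)
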
{{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 709.694899] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 711.626524] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9eb9ae95-cb11-4471-9cea-3940f4fa39dc tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.627457] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9eb9ae95-cb11-4471-9cea-3940f4fa39dc tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 712.352795] env[68964]: DEBUG nova.compute.manager [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Received event network-changed-62c236e1-40d6-4bb2-8293-c131f7892f3e {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 712.356235] env[68964]: DEBUG nova.compute.manager [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Refreshing instance network info cache due to event network-changed-62c236e1-40d6-4bb2-8293-c131f7892f3e. 
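
The network-vif-plugged WARNING above and the network-changed cache refresh below both flow through Nova's external-event plumbing: Neutron posts events to the compute API, and the compute manager pops a matching waiter only if the driver registered interest first; otherwise the event is logged as unexpected. A rough illustrative model (simplified assumption-level sketch, not Nova source):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}   # (instance_uuid, event_name) -> Event

        def prepare(self, instance_uuid, event_name):
            # The driver registers interest *before* triggering the action
            # that will cause the event.
            ev = threading.Event()
            self._waiters[(instance_uuid, event_name)] = ev
            return ev

        def pop(self, instance_uuid, event_name):
            ev = self._waiters.pop((instance_uuid, event_name), None)
            if ev is None:
                # No waiter registered: the "Received unexpected event"
                # WARNING path seen above.
                return False
            ev.set()             # wake the waiting spawn thread
            return True
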
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 712.356235] env[68964]: DEBUG oslo_concurrency.lockutils [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] Acquiring lock "refresh_cache-25b453f5-0b24-4c97-9a2d-6466e1489d07" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 712.356235] env[68964]: DEBUG oslo_concurrency.lockutils [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] Acquired lock "refresh_cache-25b453f5-0b24-4c97-9a2d-6466e1489d07" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 712.356235] env[68964]: DEBUG nova.network.neutron [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Refreshing network info cache for port 62c236e1-40d6-4bb2-8293-c131f7892f3e {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 713.140105] env[68964]: DEBUG nova.network.neutron [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Updated VIF entry in instance network info cache for port 62c236e1-40d6-4bb2-8293-c131f7892f3e. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 713.141636] env[68964]: DEBUG nova.network.neutron [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Updating instance_info_cache with network_info: [{"id": "62c236e1-40d6-4bb2-8293-c131f7892f3e", "address": "fa:16:3e:63:58:f7", "network": {"id": "e797e2a4-d152-43bc-add2-0c82c2190504", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1892593703-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "92573f717a0244ff864588cec3dffa1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "fd77ecbc-aaaf-45f4-ae8f-977d90e4052f", "external-id": "nsx-vlan-transportzone-171", "segmentation_id": 171, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62c236e1-40", "ovs_interfaceid": "62c236e1-40d6-4bb2-8293-c131f7892f3e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.157538] env[68964]: DEBUG oslo_concurrency.lockutils [req-db583984-8b88-4e84-bcb7-cc23e77f260a req-41bcc2a3-3323-4d51-91dc-399d71ec7406 service nova] Releasing lock "refresh_cache-25b453f5-0b24-4c97-9a2d-6466e1489d07" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 716.780450] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8525e0d5-9df0-4466-b444-d8e0a48204f6 tempest-AttachInterfacesTestJSON-1409797562 
tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "0c909272-30a0-40b7-ad1d-90933925ff6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 716.781051] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8525e0d5-9df0-4466-b444-d8e0a48204f6 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "0c909272-30a0-40b7-ad1d-90933925ff6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 719.042600] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9f788767-3e9d-452b-8dc1-95d934f9f408 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "3005e937-65d2-4e41-8dd7-2fecaaa15365" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 719.043432] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9f788767-3e9d-452b-8dc1-95d934f9f408 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "3005e937-65d2-4e41-8dd7-2fecaaa15365" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 719.725316] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 719.725316] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 720.183933] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 720.215047] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 720.215047] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 720.215047] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 720.215047] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 720.234435] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 720.234669] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 720.234838] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 720.234998] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 720.236275] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e710fe9-f494-4b6f-bbfc-5ba9ee8fdaae {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.245136] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eb802a5-23f6-4dfb-8e8f-c23964fede75 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.263008] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78645e2e-9375-432c-92b7-cf8bfb535f73 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.270614] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31b70940-663c-48a5-a2df-0b247edc6220 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.305949] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180935MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 720.306136] env[68964]: DEBUG 
oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 720.306358] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 720.378043] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4da35ed9-8646-45b8-b66f-715195a405a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 720.378225] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 720.378356] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 617f46a1-ca50-4561-9d0f-a596e35bf26d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 720.378480] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ed73ed7d-e299-472a-805c-32bf83e96f8d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 720.378597] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 720.378715] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 720.378829] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
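
The _remove_deleted_instances_allocations lines above and below apply a simple per-instance rule to each placement allocation against this node. A rough paraphrase in code (hypothetical helper with simplified fields, not the resource tracker itself):

    def audit_allocation(instance, this_node):
        if instance.get('node') == this_node:
            # "actively managed on this compute host": allocation is valid.
            return 'keep'
        if instance.get('node') is None:
            # Scheduled here but not yet spawned: "has yet to start.
            # Skipping heal of allocation".
            return 'skip'
        # Otherwise the allocation is stale (instance deleted or moved).
        return 'remove'
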
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 720.378942] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 329835df-cb38-495e-8a0e-539a396ddc74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 720.379069] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 720.379402] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 25b453f5-0b24-4c97-9a2d-6466e1489d07 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 720.406747] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.433866] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f86d97e4-42f3-464b-9d7b-7c05f19290ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.446601] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.459141] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.469019] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 323acb55-859a-4545-a046-1934cf98be6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.480169] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.489713] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fa867511-44ae-47e6-8c05-5f2abf8eae88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.503056] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 888fb47f-5f48-415c-9289-61b9c42523e5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.517677] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c3ad57a5-1ea2-484a-b014-6276e0ee7914 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.528712] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7dbca935-17b3-4a4b-ae3e-558bc802f9b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.539361] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.553800] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8370a744-2602-410e-a509-e8487810e266 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.564349] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.575616] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0c909272-30a0-40b7-ad1d-90933925ff6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.588179] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3005e937-65d2-4e41-8dd7-2fecaaa15365 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.604114] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 720.604581] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 720.604581] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 720.988026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a1f014-44bc-4e5e-ae6d-7327e61683f4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 720.996813] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e416b9cd-bbb4-4e41-8a8c-42237846f25c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 721.034040] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f50127e-3ff7-4553-9272-b56cfd804344 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 721.045243] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e413c6e2-18d0-44f1-86de-c1d6103d8a8b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 721.064271] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 721.098950] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 721.124725] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 721.124926] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.819s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 721.365030] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ada3d0bd-f797-4749-96da-db4046c9ae04 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] Acquiring lock "7c21c92e-16ed-4e2c-90d5-9391b1eeb703" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 721.365446] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ada3d0bd-f797-4749-96da-db4046c9ae04 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] Lock "7c21c92e-16ed-4e2c-90d5-9391b1eeb703" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 721.637176] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 721.637176] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 721.725282] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 721.725282] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 721.725282] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 721.754908] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755139] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755225] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755373] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755531] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755712] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755844] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.755966] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.756135] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.756259] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 721.756377] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 721.756929] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 721.757147] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 721.757301] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 722.844357] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efda74b2-3b38-4e8f-bdf0-bd6d7b1206ce tempest-AttachInterfacesUnderV243Test-1445261461 tempest-AttachInterfacesUnderV243Test-1445261461-project-member] Acquiring lock "749fc36c-c3de-4762-bae7-515dec3c7377" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 722.844619] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efda74b2-3b38-4e8f-bdf0-bd6d7b1206ce tempest-AttachInterfacesUnderV243Test-1445261461 tempest-AttachInterfacesUnderV243Test-1445261461-project-member] Lock "749fc36c-c3de-4762-bae7-515dec3c7377" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 726.474765] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cdc17b90-87c8-4f1b-b229-fc98d44ab136 tempest-ServerActionsV293TestJSON-572515950 tempest-ServerActionsV293TestJSON-572515950-project-member] Acquiring lock "04e18d39-9cf6-4c0e-ae33-29e955827571" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 726.476924] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cdc17b90-87c8-4f1b-b229-fc98d44ab136 tempest-ServerActionsV293TestJSON-572515950 tempest-ServerActionsV293TestJSON-572515950-project-member] Lock "04e18d39-9cf6-4c0e-ae33-29e955827571" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 728.139828] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f45a791f-d01e-45aa-957f-95b8a08e9b19 tempest-ImagesNegativeTestJSON-484771123 tempest-ImagesNegativeTestJSON-484771123-project-member] Acquiring lock "92c1d7af-79e0-4cd9-a7e5-a969b4843778" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 728.140752] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f45a791f-d01e-45aa-957f-95b8a08e9b19 tempest-ImagesNegativeTestJSON-484771123 tempest-ImagesNegativeTestJSON-484771123-project-member] Lock "92c1d7af-79e0-4cd9-a7e5-a969b4843778" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 734.023371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f40ee082-f03a-44f5-bad3-1c2692051779 tempest-ServerActionsTestOtherA-473204481 tempest-ServerActionsTestOtherA-473204481-project-member] Acquiring lock "4726af42-5678-4b56-8675-76e30156feaa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 734.023715] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f40ee082-f03a-44f5-bad3-1c2692051779 tempest-ServerActionsTestOtherA-473204481 tempest-ServerActionsTestOtherA-473204481-project-member] Lock "4726af42-5678-4b56-8675-76e30156feaa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 734.381031] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ea3d12d-8d1c-45c4-ae3c-3e452436cd1c tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] Acquiring lock "eb007e0d-124f-4ef6-85d7-c68b310e8b9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 734.381128] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ea3d12d-8d1c-45c4-ae3c-3e452436cd1c tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] Lock "eb007e0d-124f-4ef6-85d7-c68b310e8b9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 734.801043] env[68964]: DEBUG oslo_concurrency.lockutils [None req-54b22aed-4387-4158-86c3-736a2e0cf3ec tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] Acquiring lock "58afa2a4-da8c-4b32-9c76-587d082de444" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 734.801284] env[68964]: DEBUG oslo_concurrency.lockutils [None req-54b22aed-4387-4158-86c3-736a2e0cf3ec tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] Lock "58afa2a4-da8c-4b32-9c76-587d082de444" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 746.424340] env[68964]: WARNING oslo_vmware.rw_handles [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles     response.begin()
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 746.424340] env[68964]: ERROR oslo_vmware.rw_handles
[ 746.424834] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 746.426500] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 746.426826] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Copying Virtual Disk [datastore2] vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/e5666b4a-ac0e-413d-ac76-335c8e9e2654/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 746.427042] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1e31b909-1bf9-4fbd-9967-e01d93e1f8da {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 746.436571] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Waiting for the task: (returnval){
[ 746.436571] env[68964]: value = "task-3431539"
[ 746.436571] env[68964]: _type = "Task"
[ 746.436571] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 746.444548] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Task: {'id': task-3431539, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 746.947913] env[68964]: DEBUG oslo_vmware.exceptions [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 746.948233] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 746.948791] env[68964]: ERROR nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 746.948791] env[68964]: Faults: ['InvalidArgument']
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Traceback (most recent call last):
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     yield resources
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self.driver.spawn(context, instance, image_meta,
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self._fetch_image_if_missing(context, vi)
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     image_cache(vi, tmp_image_ds_loc)
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     vm_util.copy_virtual_disk(
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     session._wait_for_task(vmdk_copy_task)
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     return self.wait_for_task(task_ref)
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     return evt.wait()
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     result = hub.switch()
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     return self.greenlet.switch()
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self.f(*self.args, **self.kw)
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     raise exceptions.translate_fault(task_info.error)
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Faults: ['InvalidArgument']
[ 746.948791] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]
[ 746.949876] env[68964]: INFO nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Terminating instance
[ 746.950635] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 746.950835] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 746.951077] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-48198da6-b71b-445a-953c-08d7bfcfeda4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 746.953257] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 746.953453] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 746.954215] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17e948d4-fc08-45b4-9d0b-ea2f05567764 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 746.961489] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 746.962520] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3d8cd93c-58b8-4c96-9089-1dca692c3205 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 746.963890] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 746.964074] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 746.964728] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-758639a3-2c39-42c3-83ac-5eb409202913 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 746.970027] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){
[ 746.970027] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528353c7-cc54-d3d1-c38e-c51a47e87c61"
[ 746.970027] env[68964]: _type = "Task"
[ 746.970027] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 746.977596] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528353c7-cc54-d3d1-c38e-c51a47e87c61, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 747.032908] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 747.033139] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 747.033319] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Deleting the datastore file [datastore2] 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 747.033596] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-126d4b2b-d4a8-4b0e-8d6e-53b174daceed {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.039568] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Waiting for the task: (returnval){
[ 747.039568] env[68964]: value = "task-3431541"
[ 747.039568] env[68964]: _type = "Task"
[ 747.039568] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 747.047268] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Task: {'id': task-3431541, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 747.480629] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 747.480922] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating directory with path [datastore2] vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 747.481123] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9729c521-b5b9-45a0-b859-3eb2b2a52c34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.492208] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Created directory with path [datastore2] vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 747.492409] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Fetch image to [datastore2] vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 747.492578] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 747.493359] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a090099-cbc0-48a8-ab8b-d27a443bf39a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.499941] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc0390d8-6da9-4b6c-998f-a242864dbbb1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.508996] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52d7ff7b-e918-482a-a5d5-7c0c68bb4d16 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.544193] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb07367b-a1eb-4946-8115-78e17c0a468f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.551303] env[68964]: DEBUG oslo_vmware.api [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Task: {'id': task-3431541, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075171} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 747.552761] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 747.552952] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 747.553137] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 747.553309] env[68964]: INFO nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 747.555151] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-45b79229-d6b7-4448-b01f-50e824bd6b38 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 747.557123] env[68964]: DEBUG nova.compute.claims [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 747.557299] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 747.557510] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 747.582727] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 747.636075] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 747.693904] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 747.694105] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 748.036286] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-396f7273-3037-49eb-b746-384fb4663fb3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.044632] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60a66b36-d553-47a0-9da7-1d0a6934bb4e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.076159] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86c679f1-a952-4036-846d-d6147028e571 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.084022] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66870770-20ce-4fce-818a-a9600e191f16 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 748.098136] env[68964]: DEBUG nova.compute.provider_tree [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 748.106589] env[68964]: DEBUG nova.scheduler.client.report [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 748.120132] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.562s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 748.120651] env[68964]: ERROR nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 748.120651] env[68964]: Faults: ['InvalidArgument']
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Traceback (most recent call last):
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self.driver.spawn(context, instance, image_meta,
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self._fetch_image_if_missing(context, vi)
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     image_cache(vi, tmp_image_ds_loc)
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     vm_util.copy_virtual_disk(
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     session._wait_for_task(vmdk_copy_task)
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     return self.wait_for_task(task_ref)
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     return evt.wait()
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     result = hub.switch()
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     return self.greenlet.switch()
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     self.f(*self.args, **self.kw)
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]     raise exceptions.translate_fault(task_info.error)
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Faults: ['InvalidArgument']
[ 748.120651] env[68964]: ERROR nova.compute.manager [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9]
[ 748.121523] env[68964]: DEBUG nova.compute.utils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 748.122668] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Build of instance 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9 was re-scheduled: A specified parameter was not correct: fileType
[ 748.122668] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 748.123556] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 748.123556] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 748.123556] env[68964]: DEBUG nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 748.123556] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 748.694601] env[68964]: DEBUG nova.network.neutron [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 748.704237] env[68964]: INFO nova.compute.manager [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] [instance: 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9] Took 0.58 seconds to deallocate network for instance.
[ 748.801047] env[68964]: INFO nova.scheduler.client.report [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Deleted allocations for instance 4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9
[ 748.822027] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5d101e4a-ffb7-4831-9c40-e100c514c103 tempest-ServerDiagnosticsNegativeTest-689371084 tempest-ServerDiagnosticsNegativeTest-689371084-project-member] Lock "4ebb69a3-c8c3-40cd-9e76-ed5249d61fb9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 96.162s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 748.854895] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 748.908799] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 748.908890] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 748.911255] env[68964]: INFO nova.compute.claims [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 749.343905] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bf3ea96-2ffb-4cad-b325-d66acd99a814 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 749.351797] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92613c9c-9c9b-4677-bcd7-49808e02abb9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 749.382768] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fda07b8a-c906-40d7-b167-0c8469bd6924 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 749.389676] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77e52757-bf96-4eaf-b0e0-a9044a65a96b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 749.402734] env[68964]: DEBUG nova.compute.provider_tree [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 749.411306] env[68964]: DEBUG nova.scheduler.client.report [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 749.425595] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.517s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 749.426085] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 749.458134] env[68964]: DEBUG nova.compute.utils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 749.460230] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 749.460437] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 749.468776] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 749.525306] env[68964]: DEBUG nova.policy [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e16c5efbf3634d039bf57dc8feafcb56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0284577172914b56b74ece100e1584e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 749.541318] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 749.578763] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 749.578921] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 749.579059] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 749.579236] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 749.579375] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 749.579514] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 749.579717] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 749.579906] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 749.580305] env[68964]: DEBUG nova.virt.hardware [None
req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 749.580637] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 749.582220] env[68964]: DEBUG nova.virt.hardware [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 749.584307] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ba9347f-5946-4bdc-bb49-cf06356ff82a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 749.595727] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3add30f-4297-4b15-8630-304096baddf8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 749.938962] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Successfully created port: 62cf7237-3631-4fca-8fea-7abd44b162f5 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 750.838377] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Successfully updated port: 62cf7237-3631-4fca-8fea-7abd44b162f5 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 750.857281] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "refresh_cache-ff09bdbb-84e3-4182-8118-e99512a0e9de" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 750.857456] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired lock "refresh_cache-ff09bdbb-84e3-4182-8118-e99512a0e9de" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 750.857655] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 750.916397] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 
tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.950100] env[68964]: DEBUG nova.compute.manager [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Received event network-vif-plugged-62cf7237-3631-4fca-8fea-7abd44b162f5 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 750.950351] env[68964]: DEBUG oslo_concurrency.lockutils [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] Acquiring lock "ff09bdbb-84e3-4182-8118-e99512a0e9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 750.950521] env[68964]: DEBUG oslo_concurrency.lockutils [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 750.950691] env[68964]: DEBUG oslo_concurrency.lockutils [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 750.950905] env[68964]: DEBUG nova.compute.manager [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] No waiting events found dispatching network-vif-plugged-62cf7237-3631-4fca-8fea-7abd44b162f5 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 750.951445] env[68964]: WARNING nova.compute.manager [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Received unexpected event network-vif-plugged-62cf7237-3631-4fca-8fea-7abd44b162f5 for instance with vm_state building and task_state spawning. [ 750.951703] env[68964]: DEBUG nova.compute.manager [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Received event network-changed-62cf7237-3631-4fca-8fea-7abd44b162f5 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 750.951905] env[68964]: DEBUG nova.compute.manager [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Refreshing instance network info cache due to event network-changed-62cf7237-3631-4fca-8fea-7abd44b162f5. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 750.952150] env[68964]: DEBUG oslo_concurrency.lockutils [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] Acquiring lock "refresh_cache-ff09bdbb-84e3-4182-8118-e99512a0e9de" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 751.142391] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Updating instance_info_cache with network_info: [{"id": "62cf7237-3631-4fca-8fea-7abd44b162f5", "address": "fa:16:3e:bb:c1:9e", "network": {"id": "a54b2f1d-5db7-4c69-b6fe-c1721675a5aa", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1034871949-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0284577172914b56b74ece100e1584e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1f996252-e329-42bd-a897-446dfe2b81cd", "external-id": "nsx-vlan-transportzone-535", "segmentation_id": 535, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62cf7237-36", "ovs_interfaceid": "62cf7237-3631-4fca-8fea-7abd44b162f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.160362] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Releasing lock "refresh_cache-ff09bdbb-84e3-4182-8118-e99512a0e9de" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 751.160674] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance network_info: |[{"id": "62cf7237-3631-4fca-8fea-7abd44b162f5", "address": "fa:16:3e:bb:c1:9e", "network": {"id": "a54b2f1d-5db7-4c69-b6fe-c1721675a5aa", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1034871949-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0284577172914b56b74ece100e1584e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1f996252-e329-42bd-a897-446dfe2b81cd", "external-id": "nsx-vlan-transportzone-535", "segmentation_id": 535, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap62cf7237-36", "ovs_interfaceid": "62cf7237-3631-4fca-8fea-7abd44b162f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 751.160978] env[68964]: DEBUG oslo_concurrency.lockutils [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] Acquired lock "refresh_cache-ff09bdbb-84e3-4182-8118-e99512a0e9de" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 751.161176] env[68964]: DEBUG nova.network.neutron [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Refreshing network info cache for port 62cf7237-3631-4fca-8fea-7abd44b162f5 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 751.162226] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bb:c1:9e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1f996252-e329-42bd-a897-446dfe2b81cd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '62cf7237-3631-4fca-8fea-7abd44b162f5', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 751.169913] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating folder: Project (0284577172914b56b74ece100e1584e3). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 751.170632] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f740c3d3-a9e7-4fd8-97d8-3a8056907467 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.183916] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Created folder: Project (0284577172914b56b74ece100e1584e3) in parent group-v684465. [ 751.184123] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating folder: Instances. Parent ref: group-v684503. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 751.184349] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8366b553-a100-495f-9f21-4479a91219e8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.193144] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Created folder: Instances in parent group-v684503. 
[ 751.193374] env[68964]: DEBUG oslo.service.loopingcall [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 751.193550] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 751.193772] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8210cd61-087d-43a5-bf75-b147aa41e26b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 751.218606] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 751.218606] env[68964]: value = "task-3431544"
[ 751.218606] env[68964]: _type = "Task"
[ 751.218606] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 751.226569] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431544, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 751.647756] env[68964]: DEBUG nova.network.neutron [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Updated VIF entry in instance network info cache for port 62cf7237-3631-4fca-8fea-7abd44b162f5. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 751.648131] env[68964]: DEBUG nova.network.neutron [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Updating instance_info_cache with network_info: [{"id": "62cf7237-3631-4fca-8fea-7abd44b162f5", "address": "fa:16:3e:bb:c1:9e", "network": {"id": "a54b2f1d-5db7-4c69-b6fe-c1721675a5aa", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1034871949-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0284577172914b56b74ece100e1584e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1f996252-e329-42bd-a897-446dfe2b81cd", "external-id": "nsx-vlan-transportzone-535", "segmentation_id": 535, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62cf7237-36", "ovs_interfaceid": "62cf7237-3631-4fca-8fea-7abd44b162f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 751.663338] env[68964]: DEBUG oslo_concurrency.lockutils [req-003ac396-2ede-446d-9bcd-10483fde8fe4 req-2035c331-048a-4561-a1f7-bfaf24f98b36 service nova] Releasing lock "refresh_cache-ff09bdbb-84e3-4182-8118-e99512a0e9de" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 751.729420] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431544, 'name': CreateVM_Task, 'duration_secs': 0.294687} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 751.729610] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 751.730293] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 751.730458] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 751.730773] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 751.731040] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-be318481-80a3-43d3-979d-dfe579b048dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 751.736300] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){
[ 751.736300] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52796236-38ae-a80a-ef06-e6c81edd1d8f"
[ 751.736300] env[68964]: _type = "Task"
[ 751.736300] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 751.744952] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52796236-38ae-a80a-ef06-e6c81edd1d8f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 752.250021] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 752.250021] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 752.250021] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 755.388269] env[68964]: DEBUG oslo_concurrency.lockutils [None req-df9132b3-9f33-4942-8b4e-b722eb1320d5 tempest-ImagesOneServerTestJSON-1548653576 tempest-ImagesOneServerTestJSON-1548653576-project-member] Acquiring lock "722f8bf7-1634-4190-9cc0-49b2a28c367e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 755.388575] env[68964]: DEBUG oslo_concurrency.lockutils [None req-df9132b3-9f33-4942-8b4e-b722eb1320d5 tempest-ImagesOneServerTestJSON-1548653576 tempest-ImagesOneServerTestJSON-1548653576-project-member] Lock "722f8bf7-1634-4190-9cc0-49b2a28c367e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 758.713615] env[68964]: WARNING oslo_vmware.rw_handles [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles     response.begin()
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 758.713615] env[68964]: ERROR oslo_vmware.rw_handles
[ 758.714148] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 758.715398] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 758.715644] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Copying Virtual Disk [datastore1] vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/a78a1ed1-197d-4e75-b6bf-3e930806a1c3/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 758.715970] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9ec79fb2-09cd-4a11-ae44-d40092e4bb6e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 758.724353] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Waiting for the task: (returnval){
[ 758.724353] env[68964]: value = "task-3431545"
[ 758.724353] env[68964]: _type = "Task"
[ 758.724353] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 758.732314] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Task: {'id': task-3431545, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 759.235068] env[68964]: DEBUG oslo_vmware.exceptions [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 759.235505] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 759.236167] env[68964]: ERROR nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 759.236167] env[68964]: Faults: ['InvalidArgument']
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Traceback (most recent call last):
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     yield resources
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self.driver.spawn(context, instance, image_meta,
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self._fetch_image_if_missing(context, vi)
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     image_cache(vi, tmp_image_ds_loc)
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     vm_util.copy_virtual_disk(
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     session._wait_for_task(vmdk_copy_task)
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     return self.wait_for_task(task_ref)
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     return evt.wait()
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     result = hub.switch()
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     return self.greenlet.switch()
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self.f(*self.args, **self.kw)
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     raise exceptions.translate_fault(task_info.error)
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Faults: ['InvalidArgument']
[ 759.236167] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]
[ 759.236984] env[68964]: INFO nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Terminating instance
[ 759.238810] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 759.239019] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 759.239751] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-551b9d72-0d88-4c91-b76a-13c7c07d786c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 759.246344] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 759.246566] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fb7186b3-9fe2-487d-a6b1-08cd56f21d90 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 759.309054] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 759.309291] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 759.309470] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Deleting the datastore file [datastore1] 25b453f5-0b24-4c97-9a2d-6466e1489d07 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 759.309813] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eba64bd6-7eb8-426f-8000-bd0bc4652312 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 759.316433] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Waiting for the task: (returnval){
[ 759.316433] env[68964]: value = "task-3431547"
[ 759.316433] env[68964]: _type = "Task"
[ 759.316433] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 759.323869] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Task: {'id': task-3431547, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 759.826923] env[68964]: DEBUG oslo_vmware.api [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Task: {'id': task-3431547, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066357} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 759.827257] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 759.828024] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 759.828024] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 759.828024] env[68964]: INFO nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Took 0.59 seconds to destroy the instance on the hypervisor.
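Editorial aside: every CreateVM_Task / SearchDatastore_Task / CopyVirtualDisk_Task / DeleteDatastoreFile_Task block above follows the same shape: invoke the vSphere method to get a task handle, then poll its state until it reports success or error. A schematic version of that loop, not the oslo.vmware implementation; get_task_info is a hypothetical callable standing in for the vCenter round-trip:

    # Sketch: poll-until-done, the pattern behind "progress is 0%." followed
    # by "completed successfully." (or a raised fault) in the records above.
    import time

    def wait_for_task(get_task_info, task_id, interval=0.5):
        while True:
            info = get_task_info(task_id)          # one _poll_task round-trip
            if info["state"] == "success":
                return info                        # carries e.g. duration_secs
            if info["state"] == "error":
                raise RuntimeError(info["error"])  # translated into a fault
            time.sleep(interval)                   # back off, then poll again

In the real service the loop runs inside a looping call on an eventlet greenthread, which is why the failure tracebacks below pass through eventlet's hub before surfacing in the compute manager.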
[ 759.829901] env[68964]: DEBUG nova.compute.claims [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 759.830087] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 759.830302] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 760.235473] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-400afa78-41f2-4437-8ff7-f548b3f42706 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.244732] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-776b4884-e796-46f4-bf4e-9cb98b09fbdd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.275095] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72c07b83-4d21-4005-b978-3ffb4c810bce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.281326] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dffce20-e42d-43ac-863c-dffa83489906 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 760.294166] env[68964]: DEBUG nova.compute.provider_tree [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 760.303044] env[68964]: DEBUG nova.scheduler.client.report [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 760.315585] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.485s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 760.315889] env[68964]: ERROR nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 760.315889] env[68964]: Faults: ['InvalidArgument']
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Traceback (most recent call last):
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self.driver.spawn(context, instance, image_meta,
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self._fetch_image_if_missing(context, vi)
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     image_cache(vi, tmp_image_ds_loc)
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     vm_util.copy_virtual_disk(
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     session._wait_for_task(vmdk_copy_task)
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     return self.wait_for_task(task_ref)
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     return evt.wait()
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     result = hub.switch()
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     return self.greenlet.switch()
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     self.f(*self.args, **self.kw)
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]     raise exceptions.translate_fault(task_info.error)
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Faults: ['InvalidArgument']
[ 760.315889] env[68964]: ERROR nova.compute.manager [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07]
[ 760.316759] env[68964]: DEBUG nova.compute.utils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 760.318150] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Build of instance 25b453f5-0b24-4c97-9a2d-6466e1489d07 was re-scheduled: A specified parameter was not correct: fileType
[ 760.318150] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 760.318531] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 760.318704] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 760.318874] env[68964]: DEBUG nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 760.319051] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 760.722756] env[68964]: DEBUG nova.network.neutron [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 760.733668] env[68964]: INFO nova.compute.manager [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] [instance: 25b453f5-0b24-4c97-9a2d-6466e1489d07] Took 0.41 seconds to deallocate network for instance.
[ 760.826291] env[68964]: INFO nova.scheduler.client.report [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Deleted allocations for instance 25b453f5-0b24-4c97-9a2d-6466e1489d07
[ 760.853576] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c33c8354-925f-4513-a50f-051545452115 tempest-ServersWithSpecificFlavorTestJSON-972500807 tempest-ServersWithSpecificFlavorTestJSON-972500807-project-member] Lock "25b453f5-0b24-4c97-9a2d-6466e1489d07" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 80.069s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 760.871277] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 760.933507] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 760.933791] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 760.936946] env[68964]: INFO nova.compute.claims [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 761.371023] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5389abb2-ddbf-404f-8209-b381548571c1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 761.378389] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d61a5116-30a9-4a92-8730-76726ed0d2df {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 761.408833] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-923c80e0-c347-4937-866c-bcd32053ff1f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 761.416562] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-295d8563-6742-4110-8738-984d2f865c1a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 761.429638] env[68964]: DEBUG nova.compute.provider_tree [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 761.440168] env[68964]: DEBUG nova.scheduler.client.report [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 761.454636] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.521s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 761.455126] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 761.489089] env[68964]: DEBUG nova.compute.utils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 761.489298] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 761.489573] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 761.498575] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 761.552534] env[68964]: DEBUG nova.policy [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e4e2fdc38308474fa90cc324dfe1b6f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '217178a834024f5a86365c3c4d8ca9b5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 761.562717] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Start spawning the instance on the hypervisor.
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 761.590369] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 761.590625] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 761.590780] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 761.591015] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 761.591202] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 761.591362] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 761.591577] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 761.591731] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 761.591933] env[68964]: DEBUG nova.virt.hardware [None 
req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 761.592071] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 761.592252] env[68964]: DEBUG nova.virt.hardware [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 761.593158] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a212436c-16ea-49fc-9815-5f964ef1151a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.601523] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48919224-ae9c-41c6-ba14-7729497193ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 761.975325] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Successfully created port: ff771c9c-8651-42ab-a87a-de9ae2a89d53 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 762.775861] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Successfully updated port: ff771c9c-8651-42ab-a87a-de9ae2a89d53 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 762.790380] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "refresh_cache-f86d97e4-42f3-464b-9d7b-7c05f19290ce" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 762.790507] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired lock "refresh_cache-f86d97e4-42f3-464b-9d7b-7c05f19290ce" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 762.790662] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 762.839581] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 
tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.916514] env[68964]: DEBUG nova.compute.manager [req-b967dfe3-7ffc-4505-b6e9-9033a558d9b2 req-5e476ba3-5aa1-42b5-ae0e-3ca8eaedb897 service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Received event network-vif-plugged-ff771c9c-8651-42ab-a87a-de9ae2a89d53 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 762.916735] env[68964]: DEBUG oslo_concurrency.lockutils [req-b967dfe3-7ffc-4505-b6e9-9033a558d9b2 req-5e476ba3-5aa1-42b5-ae0e-3ca8eaedb897 service nova] Acquiring lock "f86d97e4-42f3-464b-9d7b-7c05f19290ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 762.916949] env[68964]: DEBUG oslo_concurrency.lockutils [req-b967dfe3-7ffc-4505-b6e9-9033a558d9b2 req-5e476ba3-5aa1-42b5-ae0e-3ca8eaedb897 service nova] Lock "f86d97e4-42f3-464b-9d7b-7c05f19290ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 762.917238] env[68964]: DEBUG oslo_concurrency.lockutils [req-b967dfe3-7ffc-4505-b6e9-9033a558d9b2 req-5e476ba3-5aa1-42b5-ae0e-3ca8eaedb897 service nova] Lock "f86d97e4-42f3-464b-9d7b-7c05f19290ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 762.917418] env[68964]: DEBUG nova.compute.manager [req-b967dfe3-7ffc-4505-b6e9-9033a558d9b2 req-5e476ba3-5aa1-42b5-ae0e-3ca8eaedb897 service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] No waiting events found dispatching network-vif-plugged-ff771c9c-8651-42ab-a87a-de9ae2a89d53 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 762.917627] env[68964]: WARNING nova.compute.manager [req-b967dfe3-7ffc-4505-b6e9-9033a558d9b2 req-5e476ba3-5aa1-42b5-ae0e-3ca8eaedb897 service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Received unexpected event network-vif-plugged-ff771c9c-8651-42ab-a87a-de9ae2a89d53 for instance with vm_state building and task_state spawning. 
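
Note on the WARNING just above: the network-vif-plugged event for port ff771c9c-8651-42ab-a87a-de9ae2a89d53 reached the compute manager while the instance was still in vm_state building / task_state spawning, so pop_instance_event found no registered waiter and the event was logged as unexpected and dropped. The sketch below illustrates that register-a-waiter-then-pop pattern in a minimal, self-contained form; the class and function names here are invented for the illustration and are not Nova's actual InstanceEvents API.

# Minimal sketch (illustrative only) of the "register a waiter, then pop on
# arrival" pattern used for external events such as network-vif-plugged.
import threading

class InstanceEventWaiters:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance_uuid, event_name):
        # Called *before* the action that will trigger the event
        # (e.g. before plugging a VIF), so the arrival can be matched.
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop(self, instance_uuid, event_name):
        # Called when an external event arrives from Neutron; returns the
        # registered waiter, or None if nobody is waiting for this event.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

events = InstanceEventWaiters()

def on_external_event(instance_uuid, event_name):
    waiter = events.pop(instance_uuid, event_name)
    if waiter is None:
        # No waiter registered yet -- the situation in the log above: the VIF
        # was plugged while the guest was still spawning, so the event is
        # reported as "unexpected" and simply discarded.
        print(f"Received unexpected event {event_name} for {instance_uuid}")
    else:
        waiter.set()  # wakes a thread blocked in waiter.wait(timeout)

# An event arriving with no registered waiter is "unexpected":
on_external_event("f86d97e4-42f3-464b-9d7b-7c05f19290ce", "network-vif-plugged")
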
[ 763.105093] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Updating instance_info_cache with network_info: [{"id": "ff771c9c-8651-42ab-a87a-de9ae2a89d53", "address": "fa:16:3e:6f:10:a2", "network": {"id": "6a7fda40-db49-452a-b342-2cda28f5876b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-800300419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "217178a834024f5a86365c3c4d8ca9b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff771c9c-86", "ovs_interfaceid": "ff771c9c-8651-42ab-a87a-de9ae2a89d53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.118544] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Releasing lock "refresh_cache-f86d97e4-42f3-464b-9d7b-7c05f19290ce" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 763.118596] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Instance network_info: |[{"id": "ff771c9c-8651-42ab-a87a-de9ae2a89d53", "address": "fa:16:3e:6f:10:a2", "network": {"id": "6a7fda40-db49-452a-b342-2cda28f5876b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-800300419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "217178a834024f5a86365c3c4d8ca9b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff771c9c-86", "ovs_interfaceid": "ff771c9c-8651-42ab-a87a-de9ae2a89d53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 763.118942] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None 
req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6f:10:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '21310d90-efbc-45a8-a97f-c4358606530f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ff771c9c-8651-42ab-a87a-de9ae2a89d53', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 763.126306] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating folder: Project (217178a834024f5a86365c3c4d8ca9b5). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 763.126916] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-865c90ee-4b88-4c92-84c7-859f35743c6c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.139123] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Created folder: Project (217178a834024f5a86365c3c4d8ca9b5) in parent group-v684465. [ 763.139123] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating folder: Instances. Parent ref: group-v684506. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 763.139123] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-56960712-1b32-46ce-8f21-edff6b1eb034 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.148464] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Created folder: Instances in parent group-v684506. [ 763.148682] env[68964]: DEBUG oslo.service.loopingcall [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 763.148856] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 763.149065] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c1e28ca0-b61d-4327-99a6-edb4c8543b7b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.167670] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 763.167670] env[68964]: value = "task-3431550" [ 763.167670] env[68964]: _type = "Task" [ 763.167670] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 763.175232] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431550, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 763.677397] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431550, 'name': CreateVM_Task, 'duration_secs': 0.338998} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 763.677568] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 763.678677] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 763.678677] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 763.678778] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 763.679079] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0de52c24-0820-4287-8072-5d59ee7b40b2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 763.683169] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 763.683169] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5294effd-3e9f-acfe-1c8f-b5c74c21ee80" [ 763.683169] env[68964]: _type = "Task" [ 763.683169] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 763.693017] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5294effd-3e9f-acfe-1c8f-b5c74c21ee80, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 764.194205] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 764.194544] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 764.194685] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 764.197089] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 764.197089] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 764.197089] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-24a77503-fe53-4e9d-a56f-5bfe66040a63 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.212130] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 764.212248] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 764.212978] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a3bf891c-b374-4db7-aba5-58985725cade {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.220143] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 764.220143] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52df9179-1155-8731-d561-457ec72261cb" [ 764.220143] env[68964]: _type = "Task" [ 764.220143] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 764.228342] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52df9179-1155-8731-d561-457ec72261cb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 764.730831] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 764.731118] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating directory with path [datastore1] vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 764.731361] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7abac200-0bd4-4ee4-9045-bbc6c01e71a9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.751662] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Created directory with path [datastore1] vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 764.751835] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Fetch image to [datastore1] vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 764.752016] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Downloading image file data 
b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 764.752801] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f715ad5-bb0f-4141-b040-80b4a8379619 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.761772] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7f78c84-4386-4afd-832a-71052989265f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.771697] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baf18ed5-8fba-4ca3-9e6e-8dc9e6b9c48b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.805645] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4993972b-24f9-48e8-a34b-7eaa17919a35 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.813021] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-25ea4048-0ba3-4e44-821c-63dd9a61c688 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 764.844305] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 765.013826] env[68964]: DEBUG oslo_vmware.rw_handles [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 765.075027] env[68964]: DEBUG nova.compute.manager [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Received event network-changed-ff771c9c-8651-42ab-a87a-de9ae2a89d53 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 765.075217] env[68964]: DEBUG nova.compute.manager [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Refreshing instance network info cache due to event network-changed-ff771c9c-8651-42ab-a87a-de9ae2a89d53. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 765.075424] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] Acquiring lock "refresh_cache-f86d97e4-42f3-464b-9d7b-7c05f19290ce" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 765.075599] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] Acquired lock "refresh_cache-f86d97e4-42f3-464b-9d7b-7c05f19290ce" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 765.075751] env[68964]: DEBUG nova.network.neutron [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Refreshing network info cache for port ff771c9c-8651-42ab-a87a-de9ae2a89d53 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 765.078807] env[68964]: DEBUG oslo_vmware.rw_handles [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 765.078926] env[68964]: DEBUG oslo_vmware.rw_handles [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 765.722928] env[68964]: DEBUG nova.network.neutron [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Updated VIF entry in instance network info cache for port ff771c9c-8651-42ab-a87a-de9ae2a89d53. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 765.723304] env[68964]: DEBUG nova.network.neutron [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Updating instance_info_cache with network_info: [{"id": "ff771c9c-8651-42ab-a87a-de9ae2a89d53", "address": "fa:16:3e:6f:10:a2", "network": {"id": "6a7fda40-db49-452a-b342-2cda28f5876b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-800300419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "217178a834024f5a86365c3c4d8ca9b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff771c9c-86", "ovs_interfaceid": "ff771c9c-8651-42ab-a87a-de9ae2a89d53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.734125] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f6ec9d0-2e19-45e3-af91-7cc03c7064ce req-838fdd8a-31b2-4441-af9f-57b92244606f service nova] Releasing lock "refresh_cache-f86d97e4-42f3-464b-9d7b-7c05f19290ce" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 774.878151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "3770333e-4721-424d-ac86-2291c002e99a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 774.878151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "3770333e-4721-424d-ac86-2291c002e99a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 779.724627] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 779.724859] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 779.736673] env[68964]: DEBUG 
oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 779.736888] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 779.737097] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 779.737266] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 779.738356] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1f233f7-86f7-4b28-97f3-82bb1c4b82a6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.747223] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39f2bc0f-b85e-40b2-ad67-f4ce582b93a4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.761322] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-babf03f6-8505-40c1-8b67-4b97c56f895f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.767904] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883f0189-8e3c-466f-b054-b49f69cbdf55 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.797426] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 779.797578] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 779.797768] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 779.869596] env[68964]: DEBUG nova.compute.resource_tracker [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4da35ed9-8646-45b8-b66f-715195a405a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.869762] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 617f46a1-ca50-4561-9d0f-a596e35bf26d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.869890] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ed73ed7d-e299-472a-805c-32bf83e96f8d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870029] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870161] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870280] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870398] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 329835df-cb38-495e-8a0e-539a396ddc74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870514] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870628] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.870743] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f86d97e4-42f3-464b-9d7b-7c05f19290ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 779.881749] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.893325] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.903587] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 323acb55-859a-4545-a046-1934cf98be6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.913732] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.923563] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fa867511-44ae-47e6-8c05-5f2abf8eae88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.934516] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 888fb47f-5f48-415c-9289-61b9c42523e5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.944928] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c3ad57a5-1ea2-484a-b014-6276e0ee7914 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.955687] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7dbca935-17b3-4a4b-ae3e-558bc802f9b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.965711] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.979240] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8370a744-2602-410e-a509-e8487810e266 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.990176] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 779.999496] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0c909272-30a0-40b7-ad1d-90933925ff6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.009167] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3005e937-65d2-4e41-8dd7-2fecaaa15365 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.018997] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.028798] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7c21c92e-16ed-4e2c-90d5-9391b1eeb703 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.039153] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 749fc36c-c3de-4762-bae7-515dec3c7377 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.048664] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 04e18d39-9cf6-4c0e-ae33-29e955827571 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.058141] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 92c1d7af-79e0-4cd9-a7e5-a969b4843778 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.067971] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4726af42-5678-4b56-8675-76e30156feaa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.077845] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb007e0d-124f-4ef6-85d7-c68b310e8b9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.088631] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 58afa2a4-da8c-4b32-9c76-587d082de444 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.098971] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 722f8bf7-1634-4190-9cc0-49b2a28c367e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.108959] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 780.109138] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 780.109326] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 780.467108] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9ec4000-5114-4009-8eec-89962e26986d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.475697] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3724ac7c-7ca9-4eb5-b624-fb941c999942 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.506033] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35752736-ec50-4700-aef3-e320913fbd68 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.513305] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32f5cec5-f456-4dbe-ba2c-1002ffd13cf5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.526483] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
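The "Final resource view" record above can be sanity-checked by hand: used_ram=1792MB is consistent with ten allocated instances of the m1.nano size (memory_mb=128, root_gb=1, vcpus=1, per the flavor dumped later in this log) plus the 512 MB of reserved host memory that the MEMORY_MB inventory in the next record reports. A back-of-the-envelope check in plain Python (not Nova code; the 512 MB reserved value is taken from that inventory record):

```python
# Back-of-the-envelope check of the "Final resource view" record.
# All inputs are read off the surrounding log records.
instances = 10          # "total allocated vcpus: 10", one vCPU each
flavor_ram_mb = 128     # m1.nano memory_mb
flavor_disk_gb = 1      # m1.nano root_gb
reserved_ram_mb = 512   # 'reserved' in the MEMORY_MB inventory below

used_ram_mb = reserved_ram_mb + instances * flavor_ram_mb
used_disk_gb = instances * flavor_disk_gb

assert used_ram_mb == 1792   # matches used_ram=1792MB
assert used_disk_gb == 10    # matches used_disk=10GB
```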
[ 780.534704] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 780.549070] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 780.549266] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.751s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 781.544048] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 781.544315] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 781.544448] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 781.723963] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 781.724223] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 781.724388] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 781.724535] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
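The burst of "Running periodic task ComputeManager._*" records comes from oslo.service's periodic-task machinery (run_periodic_tasks in oslo_service/periodic_task.py per the location tags): handlers register themselves via a decorator and a runner fires whichever ones are due on each pass. A minimal sketch of that pattern in plain Python, not the oslo.service implementation itself; the registry and runner here are simplified stand-ins:

```python
# Sketch of the periodic-task pattern behind these records.
# Tasks register via a decorator; one runner pass invokes any task
# whose spacing has elapsed (cf. oslo_service/periodic_task.py:210).
import time

_REGISTRY = []

def periodic_task(spacing):
    """Register the decorated function to run every `spacing` seconds."""
    def wrap(fn):
        _REGISTRY.append({"fn": fn, "spacing": spacing, "last": 0.0})
        return fn
    return wrap

@periodic_task(spacing=60)
def _check_instance_build_time():
    print("Running periodic task _check_instance_build_time")

@periodic_task(spacing=60)
def _poll_rescued_instances():
    print("Running periodic task _poll_rescued_instances")

def run_periodic_tasks():
    """One scheduler pass: run whatever is due."""
    now = time.monotonic()
    for task in _REGISTRY:
        if now - task["last"] >= task["spacing"]:
            task["last"] = now
            task["fn"]()

run_periodic_tasks()
```

Note how _reclaim_queued_deletes above still runs on schedule but exits immediately because CONF.reclaim_instance_interval <= 0; the gating lives inside the task body, not in the scheduler.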
[ 782.724857] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 782.725153] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 782.725204] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 782.746603] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.746770] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.746903] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.747111] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.747375] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.747524] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.747656] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.747795] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building.
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.747903] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.748033] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 782.748162] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 797.433271] env[68964]: WARNING oslo_vmware.rw_handles [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 797.433271] env[68964]: ERROR oslo_vmware.rw_handles [ 797.433271] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 797.435303] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 797.435580] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Copying Virtual 
Disk [datastore2] vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/885defea-512d-4ac0-9b72-60165c4ba7cc/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 797.435914] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-52c626de-687a-4feb-ba90-4484f13a4717 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.445296] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){ [ 797.445296] env[68964]: value = "task-3431551" [ 797.445296] env[68964]: _type = "Task" [ 797.445296] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 797.453507] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': task-3431551, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 797.955841] env[68964]: DEBUG oslo_vmware.exceptions [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 797.956162] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 797.956728] env[68964]: ERROR nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.956728] env[68964]: Faults: ['InvalidArgument'] [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Traceback (most recent call last): [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] yield resources [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self.driver.spawn(context, instance, image_meta, [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self._fetch_image_if_missing(context, vi) [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] image_cache(vi, tmp_image_ds_loc) [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] vm_util.copy_virtual_disk( [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] session._wait_for_task(vmdk_copy_task) [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] return self.wait_for_task(task_ref) [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] return evt.wait() [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] result = hub.switch() [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] return self.greenlet.switch() [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self.f(*self.args, **self.kw) [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] raise exceptions.translate_fault(task_info.error) [ 797.956728] 
env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Faults: ['InvalidArgument'] [ 797.956728] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] [ 797.957775] env[68964]: INFO nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Terminating instance [ 797.958905] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 797.959128] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 797.959755] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 797.959942] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 797.960177] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fda472e0-de08-48e9-a36c-dd34939c1515 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.962617] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81933af3-c3c6-461e-b417-6687fc1e37fe {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.969452] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 797.969661] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e763f2e5-a37c-4b4f-a0ed-fe6e3bb2b9a4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.971749] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
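The earlier "Fault InvalidArgument not matched" record (from get_fault_class in oslo_vmware/exceptions.py) explains why the traceback ends in the generic VimFaultException: no specific exception class was registered for the InvalidArgument fault name, so the library fell back to the catch-all carrying the fault list. An illustrative sketch of that lookup-with-fallback pattern; the registry contents and function body here are assumptions modeled on the log, not oslo.vmware's actual code:

```python
# Illustrative fault-name -> exception-class translation, the pattern
# implied by "Fault InvalidArgument not matched" in get_fault_class.
class VimFaultException(Exception):
    """Generic fallback carrying the fault names, as in the traceback."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class FileFaultException(Exception):
    pass

# Hypothetical registry: only some well-known faults map to a
# dedicated class; everything else falls through.
_FAULT_CLASSES = {"FileFault": FileFaultException}

def translate_fault(fault_name, message):
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is None:
        # "Fault InvalidArgument not matched" -> generic exception
        return VimFaultException([fault_name], message)
    return cls(message)

exc = translate_fault("InvalidArgument",
                      "A specified parameter was not correct: fileType")
print(type(exc).__name__, exc.fault_list)
# VimFaultException ['InvalidArgument']
```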
[ 797.971923] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 797.972874] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-227f4a66-d0e9-47c8-8bf0-bf7bf0f9ecf0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 797.977756] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Waiting for the task: (returnval){ [ 797.977756] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a58f84-d616-a970-d7b6-fd218e3dfcad" [ 797.977756] env[68964]: _type = "Task" [ 797.977756] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 797.987322] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a58f84-d616-a970-d7b6-fd218e3dfcad, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 798.040486] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 798.040690] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 798.040864] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Deleting the datastore file [datastore2] 4da35ed9-8646-45b8-b66f-715195a405a6 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 798.041145] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-06abab6c-4fac-44aa-93c7-0a08f0b52f90 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.046556] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){ [ 798.046556] env[68964]: value = "task-3431553" [ 798.046556] env[68964]: _type = "Task" [ 798.046556] env[68964]: } to complete.
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 798.054044] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': task-3431553, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 798.488816] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 798.488816] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Creating directory with path [datastore2] vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 798.489280] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa3313a5-b94e-404f-8e3b-9bbfbbfad72f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.500172] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Created directory with path [datastore2] vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 798.500381] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Fetch image to [datastore2] vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 798.500510] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 798.501266] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be7e1f69-73b6-4c75-988d-c4fd7a02ce9b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.508021] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda47cef-9bc5-4eab-b6ff-1cfa9e2d9e47 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.518440] env[68964]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-300838ab-7521-4044-845c-acbe748eb57c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.558019] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b984af-61c8-4941-8446-18e503ce59e4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.564605] env[68964]: DEBUG oslo_vmware.api [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': task-3431553, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078356} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 798.566106] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 798.566259] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 798.566451] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 798.566625] env[68964]: INFO nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Took 0.61 seconds to destroy the instance on the hypervisor. 
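Both vCenter tasks in this stretch (SearchDatastore_Task and DeleteDatastoreFile_Task, the latter completing above with duration_secs: 0.078356) follow the same wait_for_task/_poll_task rhythm: fetch task info, log progress, return on success, raise a translated fault on error. A generic polling loop in that shape; `get_task_info` is a hypothetical stand-in for the vSphere TaskInfo fetch, and this is a sketch rather than oslo.vmware's api.py:

```python
# Generic task-polling loop in the shape of wait_for_task/_poll_task.
import time

def wait_for_task(get_task_info, interval=0.5):
    while True:
        info = get_task_info()
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # cf. raise exceptions.translate_fault(task_info.error)
            raise RuntimeError(info["error"])
        # queued/running: log progress and poll again
        print(f"Task {info['id']} progress is {info.get('progress', 0)}%.")
        time.sleep(interval)

# Toy stand-in that completes on the second poll, like task-3431553.
_polls = iter([
    {"id": "task-3431553", "state": "running", "progress": 0},
    {"id": "task-3431553", "state": "success", "result": None},
])
wait_for_task(lambda: next(_polls), interval=0)
```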
[ 798.572027] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0c097093-67a7-4dd4-a8e8-b5d3d1f68e62 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.572027] env[68964]: DEBUG nova.compute.claims [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 798.572027] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 798.572027] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.595562] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 798.658761] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 798.720801] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 798.721008] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
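The rw_handles records above describe the write side of the image fetch: a single HTTPS PUT against the ESX host's /folder endpoint with a known Content-Length (21318656 bytes here), fed chunk by chunk from the Glance image iterator, then a getresponse() on close. The RemoteDisconnected warning logged earlier at 797.433271 is exactly what that close looks like when the far end drops the connection before sending a status line. A stdlib-only sketch of that write-handle shape; the host, path, and cookie argument are placeholders, and this is illustrative rather than oslo.vmware's rw_handles code:

```python
# Stdlib sketch of a datastore write handle: open a PUT with a known
# Content-Length, stream chunks, then read the response on close.
import http.client

def upload_stream(host, path, chunks, total_size, cookie=""):
    conn = http.client.HTTPSConnection(host, 443)
    conn.putrequest("PUT", path)
    conn.putheader("Content-Length", str(total_size))
    if cookie:
        # cf. SessionManager.AcquireGenericServiceTicket above, which
        # obtains the credential vCenter expects for datastore access
        conn.putheader("Cookie", cookie)
    conn.endheaders()
    for chunk in chunks:          # each chunk is a bytes object
        conn.send(chunk)
    # If the server closes the socket here without a status line,
    # getresponse() raises http.client.RemoteDisconnected -- the
    # exact warning seen at 797.433271 in this log.
    resp = conn.getresponse()
    resp.read()
    conn.close()
    return resp.status
```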
[ 799.080299] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ba3baf1-81a7-4548-9413-5216f30b38ff {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.088100] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f79121a9-5b2f-4756-ab03-30d11320ce13 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.118777] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78aff42b-c955-41d1-8915-2f72d46310b0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.125723] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71279c95-cd07-4c84-a1cf-30882e816ff3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.138917] env[68964]: DEBUG nova.compute.provider_tree [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 799.147447] env[68964]: DEBUG nova.scheduler.client.report [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 799.160688] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.590s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 799.161210] env[68964]: ERROR nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 799.161210] env[68964]: Faults: ['InvalidArgument'] [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Traceback (most recent call last): [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 799.161210] env[68964]: ERROR nova.compute.manager [instance:
4da35ed9-8646-45b8-b66f-715195a405a6] self.driver.spawn(context, instance, image_meta, [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self._fetch_image_if_missing(context, vi) [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] image_cache(vi, tmp_image_ds_loc) [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] vm_util.copy_virtual_disk( [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] session._wait_for_task(vmdk_copy_task) [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] return self.wait_for_task(task_ref) [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] return evt.wait() [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] result = hub.switch() [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] return self.greenlet.switch() [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] self.f(*self.args, **self.kw) [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] raise exceptions.translate_fault(task_info.error) [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Faults: ['InvalidArgument'] [ 799.161210] env[68964]: ERROR nova.compute.manager [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] [ 799.162093] env[68964]: DEBUG nova.compute.utils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 799.163332] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Build of instance 4da35ed9-8646-45b8-b66f-715195a405a6 was re-scheduled: A specified parameter was not correct: fileType [ 799.163332] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 799.163690] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 799.163861] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
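The sequence from "Failed to build and run instance" to "was re-scheduled" followed by VIF and network cleanup is Nova's standard build-failure path: the spawn exception is caught in _build_and_run_instance, surfaced to _do_build_and_run_instance as a reschedule, and the partially allocated resources are torn down. A compressed sketch of that control flow; the function and exception names are illustrative stand-ins, not Nova's exact signatures:

```python
# Compressed sketch of the build-failure path visible above:
# spawn fails -> build marked rescheduled -> networks cleaned up.
class RescheduledException(Exception):
    """Stands in for Nova's reschedule signal."""

def build_and_run_instance(instance, spawn):
    try:
        spawn(instance)
    except Exception as exc:
        # "Failed to build and run instance: <fault>"
        raise RescheduledException(str(exc)) from exc

def do_build_and_run_instance(instance, spawn, deallocate_network):
    try:
        build_and_run_instance(instance, spawn)
    except RescheduledException as exc:
        # "Build of instance ... was re-scheduled: <fault>"
        print(f"Build of instance {instance} was re-scheduled: {exc}")
        # "Unplugging VIFs" would happen here if the driver supported
        # it, then "Deallocating network for instance"
        deallocate_network(instance)
        return "rescheduled"
    return "done"

def failing_spawn(instance):
    raise RuntimeError("A specified parameter was not correct: fileType")

print(do_build_and_run_instance(
    "4da35ed9-8646-45b8-b66f-715195a405a6", failing_spawn,
    lambda inst: print("deallocate_for_instance()", inst)))
```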
[ 799.164025] env[68964]: DEBUG nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 799.164193] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 799.543498] env[68964]: DEBUG nova.network.neutron [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 799.554625] env[68964]: INFO nova.compute.manager [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: 4da35ed9-8646-45b8-b66f-715195a405a6] Took 0.39 seconds to deallocate network for instance. [ 799.695781] env[68964]: INFO nova.scheduler.client.report [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Deleted allocations for instance 4da35ed9-8646-45b8-b66f-715195a405a6 [ 799.722087] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9d26b72d-5608-41c5-904c-8503a2a017e1 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "4da35ed9-8646-45b8-b66f-715195a405a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 148.033s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 799.747685] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Starting instance...
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 799.802594] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 799.802716] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 799.804380] env[68964]: INFO nova.compute.claims [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 800.287196] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40fd7d43-5012-4cc2-b2c0-6bc991ede3e9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.293425] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9998fa4c-c1f9-4196-8ac6-ad885cfc2540 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.324405] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5ae921e-edb2-47d1-a210-de30b877434d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.331990] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8e7fbcf-8f8a-4833-9024-c73aa08629b9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.345646] env[68964]: DEBUG nova.compute.provider_tree [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 800.353902] env[68964]: DEBUG nova.scheduler.client.report [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 800.372466] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 
tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.570s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.372965] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 800.410306] env[68964]: DEBUG nova.compute.utils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 800.411950] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 800.412156] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 800.423939] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 800.490405] env[68964]: DEBUG nova.policy [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcc7c9f8bf1340deb18209158b32ef39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e24bc6fb36dc43ef9388fac48dabb8f1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 800.498315] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
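"Claim successful on node domain-c8..." for 5b2a39da, together with the earlier "Aborting claim:" for the failed 4da35ed9 build, shows the resource-claim contract: reserve capacity under the compute_resources lock before building, and give it back only if the build never completes. A minimal claim-as-context-manager rendering of that contract; this is a sketch of the shape, not Nova's ResourceTracker:

```python
# Minimal claim pattern: take resources under a lock up front
# ("Claim successful"), revert them if the build raises
# ("Aborting claim:"), keep them on success.
import threading

class ResourceTracker:
    def __init__(self, vcpus):
        self._lock = threading.Lock()      # cf. Lock "compute_resources"
        self.free_vcpus = vcpus

    def instance_claim(self, vcpus):
        tracker = self
        class Claim:
            def __enter__(self):
                with tracker._lock:
                    if tracker.free_vcpus < vcpus:
                        raise RuntimeError("insufficient vCPUs")
                    tracker.free_vcpus -= vcpus   # "Claim successful"
                return self
            def __exit__(self, exc_type, exc, tb):
                if exc_type is not None:          # "Aborting claim:"
                    with tracker._lock:
                        tracker.free_vcpus += vcpus
                return False                      # re-raise the failure
        return Claim()

rt = ResourceTracker(vcpus=48)
with rt.instance_claim(vcpus=1):
    pass  # build the instance; an exception here would abort the claim
print(rt.free_vcpus)  # 47: a successful claim persists past the block
```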
[ 800.526007] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 800.526288] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 800.526503] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 800.526693] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 800.526838] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 800.526981] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 800.527213] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 800.527438] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 800.527627] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754
tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 800.527793] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 800.527972] env[68964]: DEBUG nova.virt.hardware [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 800.528961] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ccda401-b80c-416e-a136-327f1c5295ff {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.536757] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-694bcdc8-4dbd-4741-92cb-5f636928e751 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 800.934233] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Successfully created port: 31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 801.803241] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Successfully updated port: 31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 801.817737] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquiring lock "refresh_cache-5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 801.817737] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquired lock "refresh_cache-5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 801.817737] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 801.825128] env[68964]: DEBUG nova.compute.manager [req-4eb4261b-73fa-4c04-8495-4f378ec85c41 req-8152f7fb-bf59-4271-a41c-bae729294e51 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Received event 
network-vif-plugged-31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 801.825128] env[68964]: DEBUG oslo_concurrency.lockutils [req-4eb4261b-73fa-4c04-8495-4f378ec85c41 req-8152f7fb-bf59-4271-a41c-bae729294e51 service nova] Acquiring lock "5b2a39da-1d95-4b2d-ab0e-8440f4544ef5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 801.825128] env[68964]: DEBUG oslo_concurrency.lockutils [req-4eb4261b-73fa-4c04-8495-4f378ec85c41 req-8152f7fb-bf59-4271-a41c-bae729294e51 service nova] Lock "5b2a39da-1d95-4b2d-ab0e-8440f4544ef5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 801.825128] env[68964]: DEBUG oslo_concurrency.lockutils [req-4eb4261b-73fa-4c04-8495-4f378ec85c41 req-8152f7fb-bf59-4271-a41c-bae729294e51 service nova] Lock "5b2a39da-1d95-4b2d-ab0e-8440f4544ef5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.825128] env[68964]: DEBUG nova.compute.manager [req-4eb4261b-73fa-4c04-8495-4f378ec85c41 req-8152f7fb-bf59-4271-a41c-bae729294e51 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] No waiting events found dispatching network-vif-plugged-31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 801.825128] env[68964]: WARNING nova.compute.manager [req-4eb4261b-73fa-4c04-8495-4f378ec85c41 req-8152f7fb-bf59-4271-a41c-bae729294e51 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Received unexpected event network-vif-plugged-31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 for instance with vm_state building and task_state spawning. [ 801.864332] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 802.081858] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Updating instance_info_cache with network_info: [{"id": "31fb06b2-dfe7-4a9a-95f1-aff47f607bb2", "address": "fa:16:3e:0c:bc:8d", "network": {"id": "def91976-138d-432a-b50c-cf10946eb200", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1995902156-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e24bc6fb36dc43ef9388fac48dabb8f1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31fb06b2-df", "ovs_interfaceid": "31fb06b2-dfe7-4a9a-95f1-aff47f607bb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 802.099208] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Releasing lock "refresh_cache-5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 802.099514] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Instance network_info: |[{"id": "31fb06b2-dfe7-4a9a-95f1-aff47f607bb2", "address": "fa:16:3e:0c:bc:8d", "network": {"id": "def91976-138d-432a-b50c-cf10946eb200", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1995902156-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e24bc6fb36dc43ef9388fac48dabb8f1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31fb06b2-df", "ovs_interfaceid": "31fb06b2-dfe7-4a9a-95f1-aff47f607bb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 802.099886] env[68964]: DEBUG nova.virt.vmwareapi.vmops 
[None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0c:bc:8d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e028024-a9c1-4cae-8849-ea770a7ae0e4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '31fb06b2-dfe7-4a9a-95f1-aff47f607bb2', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 802.112041] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Creating folder: Project (e24bc6fb36dc43ef9388fac48dabb8f1). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 802.112041] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ec152be6-c8d6-4e83-b9b1-9294c4bf2ebc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.123583] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Created folder: Project (e24bc6fb36dc43ef9388fac48dabb8f1) in parent group-v684465. [ 802.123766] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Creating folder: Instances. Parent ref: group-v684509. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 802.124010] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-34d41257-3554-4d3c-be3a-dd627bf7af52 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.131981] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Created folder: Instances in parent group-v684509. [ 802.132242] env[68964]: DEBUG oslo.service.loopingcall [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 802.132458] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 802.132676] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-213e509c-362d-43b7-bbe9-979b5691393a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.152442] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 802.152442] env[68964]: value = "task-3431556" [ 802.152442] env[68964]: _type = "Task" [ 802.152442] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 802.159866] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431556, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 802.328035] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 802.328275] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 802.662515] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431556, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 803.162997] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431556, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 803.662946] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431556, 'name': CreateVM_Task, 'duration_secs': 1.299092} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 803.663183] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 803.663861] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 803.664056] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 803.664366] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 803.664612] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-af263c0c-fe31-42e6-be14-626218cbcda9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 803.668944] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Waiting for the task: (returnval){ [ 803.668944] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d200dc-23d3-61a3-3781-6b55825a0628" [ 803.668944] env[68964]: _type = "Task" [ 803.668944] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 803.676105] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d200dc-23d3-61a3-3781-6b55825a0628, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 803.854442] env[68964]: DEBUG nova.compute.manager [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Received event network-changed-31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 803.854697] env[68964]: DEBUG nova.compute.manager [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Refreshing instance network info cache due to event network-changed-31fb06b2-dfe7-4a9a-95f1-aff47f607bb2. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 803.854852] env[68964]: DEBUG oslo_concurrency.lockutils [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] Acquiring lock "refresh_cache-5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 803.854994] env[68964]: DEBUG oslo_concurrency.lockutils [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] Acquired lock "refresh_cache-5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 803.855419] env[68964]: DEBUG nova.network.neutron [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Refreshing network info cache for port 31fb06b2-dfe7-4a9a-95f1-aff47f607bb2 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 804.179479] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 804.179801] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 804.179950] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 804.354809] env[68964]: DEBUG nova.network.neutron [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Updated VIF entry in instance network info cache for port 31fb06b2-dfe7-4a9a-95f1-aff47f607bb2. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 804.355197] env[68964]: DEBUG nova.network.neutron [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Updating instance_info_cache with network_info: [{"id": "31fb06b2-dfe7-4a9a-95f1-aff47f607bb2", "address": "fa:16:3e:0c:bc:8d", "network": {"id": "def91976-138d-432a-b50c-cf10946eb200", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1995902156-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e24bc6fb36dc43ef9388fac48dabb8f1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31fb06b2-df", "ovs_interfaceid": "31fb06b2-dfe7-4a9a-95f1-aff47f607bb2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 804.364461] env[68964]: DEBUG oslo_concurrency.lockutils [req-5afa599c-bdf6-4a03-a2c9-b81bb6b0c022 req-287f74f2-aba0-4f80-adb2-623346532501 service nova] Releasing lock "refresh_cache-5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 811.886896] env[68964]: WARNING oslo_vmware.rw_handles [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 811.886896] env[68964]: ERROR oslo_vmware.rw_handles [ 811.887499] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-55fc074a-96f6-42f6-ac28-c43406272b32 
tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 811.888841] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 811.889098] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Copying Virtual Disk [datastore1] vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/fd08d564-9c10-4049-8617-5190ab29117c/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 811.889385] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-74b8f60f-173f-41f8-b1fc-a080bd4b6221 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 811.898330] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 811.898330] env[68964]: value = "task-3431557" [ 811.898330] env[68964]: _type = "Task" [ 811.898330] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 811.907120] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': task-3431557, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 812.409260] env[68964]: DEBUG oslo_vmware.exceptions [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 812.409556] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 812.410107] env[68964]: ERROR nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 812.410107] env[68964]: Faults: ['InvalidArgument'] [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Traceback (most recent call last): [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] yield resources [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self.driver.spawn(context, instance, image_meta, [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self._fetch_image_if_missing(context, vi) [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] image_cache(vi, tmp_image_ds_loc) [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] vm_util.copy_virtual_disk( [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] session._wait_for_task(vmdk_copy_task) [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] return self.wait_for_task(task_ref) [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] return evt.wait() [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] result = hub.switch() [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] return self.greenlet.switch() [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self.f(*self.args, **self.kw) [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] raise exceptions.translate_fault(task_info.error) [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Faults: ['InvalidArgument'] [ 812.410107] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] [ 812.411074] env[68964]: INFO nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Terminating instance [ 812.411915] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 812.412129] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 812.412362] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4451f9f1-b3b5-4f05-b883-689f019f1e2a {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.414584] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 812.414774] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 812.415510] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4366a69d-fc06-4d1c-8f39-b62133c8bd22 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.422481] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 812.422697] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-90866f20-3eac-4f16-8aca-2194ef92c861 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.424896] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 812.425073] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 812.426075] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d0be1bff-2343-4a9f-bbe3-d441135ff09e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.430843] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Waiting for the task: (returnval){ [ 812.430843] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52035629-380e-1bd5-5427-640a3b7c4b4a" [ 812.430843] env[68964]: _type = "Task" [ 812.430843] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 812.437958] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52035629-380e-1bd5-5427-640a3b7c4b4a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 812.494778] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 812.494945] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 812.495142] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Deleting the datastore file [datastore1] f86d97e4-42f3-464b-9d7b-7c05f19290ce {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 812.495393] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6bff32bf-a71d-4b57-bd49-c256be76b1fc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.501061] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 812.501061] env[68964]: value = "task-3431559" [ 812.501061] env[68964]: _type = "Task" [ 812.501061] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 812.508447] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': task-3431559, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 812.941057] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 812.941342] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Creating directory with path [datastore1] vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 812.941579] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46c1be04-d656-42c0-80f3-b399fded32ac {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.953293] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Created directory with path [datastore1] vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 812.953508] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Fetch image to [datastore1] vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 812.953701] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 812.954455] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87a13b32-6305-4140-94f7-933e8a33bb0a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.961437] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-173de677-6fde-47d7-9f0a-de3db08bb80a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 812.970655] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03cbaf3b-de00-4552-a1d8-ac667753016d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.006125] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-024cb2f0-66f8-4ea4-86c2-b09d904703c6 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.016162] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0cd27b3a-105f-4629-bb2f-d449eab97983 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.017781] env[68964]: DEBUG oslo_vmware.api [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': task-3431559, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06952} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 813.017781] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 813.017781] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 813.017950] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 813.018054] env[68964]: INFO nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Took 0.60 seconds to destroy the instance on the hypervisor. 
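[editor's annotation — not part of the captured log] The CreateVM_Task and DeleteDatastoreFile_Task entries above show the standard oslo.vmware task-polling pattern this whole log revolves around: an API invocation returns a Task managed object, and the session polls its TaskInfo (the "progress is 0%." / "progress is 99%." lines) until the task succeeds or the translated fault is raised. A minimal sketch of that pattern follows, assuming an already-reachable vCenter; the host, credentials, and the folder_ref/config_spec/respool_ref morefs are illustrative placeholders, not values from this log:

    from oslo_vmware import api

    # Establish a vSphere API session (mirrors the VMwareAPISession._create_session
    # entries at the start of the log; host and credentials are placeholders).
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10,
                                   task_poll_interval=0.5)

    # Folder.CreateVM_Task returns a Task managed object reference.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=respool_ref)

    # wait_for_task() polls TaskInfo on a looping call until the task reaches
    # 'success' (returning the TaskInfo) or an error state, in which case it
    # raises the translated fault -- the same path that surfaces
    # "VimFaultException: A specified parameter was not correct: fileType"
    # for CopyVirtualDisk_Task later in this log.
    task_info = session.wait_for_task(task)
    vm_ref = task_info.result  # the created VM's managed object reference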
[ 813.020316] env[68964]: DEBUG nova.compute.claims [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 813.020495] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.020706] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 813.047099] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 813.110693] env[68964]: DEBUG oslo_vmware.rw_handles [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 813.173024] env[68964]: DEBUG oslo_vmware.rw_handles [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 813.173024] env[68964]: DEBUG oslo_vmware.rw_handles [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 813.505279] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31b02d7d-09cb-46ef-ad4c-a09763e3acb1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.512932] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0b5428-6d27-4ab7-a06c-b2f4d57667ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.542219] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99d1b8e8-0767-4a17-a822-26b935c451c7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.548692] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93962a65-a15c-4e0a-9043-24e5985524e0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.561218] env[68964]: DEBUG nova.compute.provider_tree [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 813.571900] env[68964]: DEBUG nova.scheduler.client.report [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 813.585827] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.565s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 813.586413] env[68964]: ERROR nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 813.586413] env[68964]: Faults: ['InvalidArgument'] [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Traceback (most recent call last): [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 813.586413] env[68964]: ERROR 
nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self.driver.spawn(context, instance, image_meta, [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self._fetch_image_if_missing(context, vi) [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] image_cache(vi, tmp_image_ds_loc) [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] vm_util.copy_virtual_disk( [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] session._wait_for_task(vmdk_copy_task) [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] return self.wait_for_task(task_ref) [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] return evt.wait() [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] result = hub.switch() [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] return self.greenlet.switch() [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] self.f(*self.args, **self.kw) [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] raise exceptions.translate_fault(task_info.error) [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Faults: ['InvalidArgument'] [ 813.586413] env[68964]: ERROR nova.compute.manager [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] [ 813.587362] env[68964]: DEBUG nova.compute.utils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 813.589073] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Build of instance f86d97e4-42f3-464b-9d7b-7c05f19290ce was re-scheduled: A specified parameter was not correct: fileType [ 813.589073] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 813.589539] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 813.589741] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 813.589931] env[68964]: DEBUG nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 813.590114] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 813.965834] env[68964]: DEBUG nova.network.neutron [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 813.977257] env[68964]: INFO nova.compute.manager [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: f86d97e4-42f3-464b-9d7b-7c05f19290ce] Took 0.39 seconds to deallocate network for instance. [ 814.080285] env[68964]: INFO nova.scheduler.client.report [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Deleted allocations for instance f86d97e4-42f3-464b-9d7b-7c05f19290ce [ 814.101478] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55fc074a-96f6-42f6-ac28-c43406272b32 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "f86d97e4-42f3-464b-9d7b-7c05f19290ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 130.724s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.123610] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 814.171631] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.171896] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.173373] env[68964]: INFO nova.compute.claims [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 814.592284] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ef273e1-de6c-4fe3-b62b-a45b894827e9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.600729] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b0749e4-cb3c-4d2e-b070-c238f50d516c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.631067] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fa8d3a5-8bc5-4743-adf2-b4943b0f8b4f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.638522] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee0f84a-ac95-4b5f-b7be-1511169dee0d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.652882] env[68964]: DEBUG nova.compute.provider_tree [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 814.662016] env[68964]: DEBUG nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 814.677245] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.505s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 814.677729] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 814.713883] env[68964]: DEBUG nova.compute.utils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 814.715145] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 814.715323] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 814.723157] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 814.809344] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Start spawning the instance on the hypervisor. 
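Note the lock choreography around the claim: the log prints both the wait time and the hold time (waited 0.000s, held 0.505s here), and every claim on this host serializes on the single "compute_resources" lock. A sketch of the same serialization using oslo.concurrency's documented synchronized decorator, with the accounting body heavily simplified:

    # Serializing resource claims on one named lock, as the
    # "compute_resources" messages above show; accounting is simplified.
    from oslo_concurrency import lockutils

    class TrackerSketch:
        def __init__(self, total_vcpus=48):
            self.free_vcpus = total_vcpus

        @lockutils.synchronized('compute_resources')
        def instance_claim(self, vcpus=1):
            # Time spent inside here shows up as ":: held N.NNNs".
            if self.free_vcpus < vcpus:
                raise RuntimeError('claim exceeds free vCPUs')
            self.free_vcpus -= vcpus

Because one lock guards claims, periodic audits, and cache cleaning alike, a slow audit directly delays instance builds, which is why the held times are worth watching in these logs.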
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 814.816456] env[68964]: DEBUG nova.policy [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56a0f29602bd49e4a9b2e3fde21b8236', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cdd0c10c06d4d55af1dd42121fb3692', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 814.850598] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 814.850902] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 814.851112] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 814.851336] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 814.851522] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 814.851706] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 814.851955] env[68964]: DEBUG 
nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 814.852168] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 814.852374] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 814.852573] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 814.852784] env[68964]: DEBUG nova.virt.hardware [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 814.853671] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b8de654-34b3-4e1c-a116-0a838992c3dc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.861882] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e664e3e0-2f9b-43ee-9703-fabec11f5dcd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 815.275857] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Successfully created port: f5d68518-253f-4997-87c9-fdf6ee36d9eb {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 816.096705] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Successfully updated port: f5d68518-253f-4997-87c9-fdf6ee36d9eb {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 816.112450] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "refresh_cache-cd99acb8-f78c-4c03-8c2e-2e9d50d18969" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 816.112450] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b 
tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquired lock "refresh_cache-cd99acb8-f78c-4c03-8c2e-2e9d50d18969" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.112450] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 816.173920] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.392395] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Updating instance_info_cache with network_info: [{"id": "f5d68518-253f-4997-87c9-fdf6ee36d9eb", "address": "fa:16:3e:48:8b:9e", "network": {"id": "b2e96435-2155-409c-9d63-91fb9cd4be6d", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1506989777-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6cdd0c10c06d4d55af1dd42121fb3692", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cae1d6a8-cbba-4bbf-af10-ba5467340475", "external-id": "nsx-vlan-transportzone-271", "segmentation_id": 271, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf5d68518-25", "ovs_interfaceid": "f5d68518-253f-4997-87c9-fdf6ee36d9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.404444] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Releasing lock "refresh_cache-cd99acb8-f78c-4c03-8c2e-2e9d50d18969" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 816.404667] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance network_info: |[{"id": "f5d68518-253f-4997-87c9-fdf6ee36d9eb", "address": "fa:16:3e:48:8b:9e", "network": {"id": "b2e96435-2155-409c-9d63-91fb9cd4be6d", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1506989777-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": 
"192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6cdd0c10c06d4d55af1dd42121fb3692", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cae1d6a8-cbba-4bbf-af10-ba5467340475", "external-id": "nsx-vlan-transportzone-271", "segmentation_id": 271, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf5d68518-25", "ovs_interfaceid": "f5d68518-253f-4997-87c9-fdf6ee36d9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 816.405072] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:8b:9e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cae1d6a8-cbba-4bbf-af10-ba5467340475', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f5d68518-253f-4997-87c9-fdf6ee36d9eb', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 816.412833] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Creating folder: Project (6cdd0c10c06d4d55af1dd42121fb3692). Parent ref: group-v684465. 
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 816.413423] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-caeb919a-a256-42e5-9130-4d44658db6df {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.418102] env[68964]: DEBUG nova.compute.manager [req-33659492-3767-40d4-8df3-c899ad4061f9 req-cd67421e-d784-4c31-9c92-ce9b6539babd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Received event network-vif-plugged-f5d68518-253f-4997-87c9-fdf6ee36d9eb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 816.418284] env[68964]: DEBUG oslo_concurrency.lockutils [req-33659492-3767-40d4-8df3-c899ad4061f9 req-cd67421e-d784-4c31-9c92-ce9b6539babd service nova] Acquiring lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 816.418477] env[68964]: DEBUG oslo_concurrency.lockutils [req-33659492-3767-40d4-8df3-c899ad4061f9 req-cd67421e-d784-4c31-9c92-ce9b6539babd service nova] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.418644] env[68964]: DEBUG oslo_concurrency.lockutils [req-33659492-3767-40d4-8df3-c899ad4061f9 req-cd67421e-d784-4c31-9c92-ce9b6539babd service nova] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 816.418812] env[68964]: DEBUG nova.compute.manager [req-33659492-3767-40d4-8df3-c899ad4061f9 req-cd67421e-d784-4c31-9c92-ce9b6539babd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] No waiting events found dispatching network-vif-plugged-f5d68518-253f-4997-87c9-fdf6ee36d9eb {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 816.418972] env[68964]: WARNING nova.compute.manager [req-33659492-3767-40d4-8df3-c899ad4061f9 req-cd67421e-d784-4c31-9c92-ce9b6539babd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Received unexpected event network-vif-plugged-f5d68518-253f-4997-87c9-fdf6ee36d9eb for instance with vm_state building and task_state spawning. [ 816.426346] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Created folder: Project (6cdd0c10c06d4d55af1dd42121fb3692) in parent group-v684465. [ 816.426528] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Creating folder: Instances. Parent ref: group-v684512. 
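The WARNING above is benign: Neutron delivered network-vif-plugged before anything on the compute side had registered a waiter for it, so pop_instance_event found no latch and the event was dropped while the instance was still building. The waiting machinery is essentially an event latch keyed by event name; a toy version of the idea, not Nova's actual implementation:

    # Toy event latch illustrating why the log says "No waiting events
    # found": an event that arrives before anyone waits is discarded.
    import threading

    class InstanceEventsSketch:
        def __init__(self):
            self._waiters = {}  # (event_name, tag) -> threading.Event

        def prepare(self, name, tag):
            evt = threading.Event()
            self._waiters[(name, tag)] = evt
            return evt          # caller later blocks on evt.wait()

        def pop_instance_event(self, name, tag):
            evt = self._waiters.pop((name, tag), None)
            if evt is None:
                return None     # "Received unexpected event ..." path
            evt.set()           # wake the waiter
            return evt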
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 816.426781] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4812f3a0-dc75-443c-af5e-9e05f9829fc0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.435029] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Created folder: Instances in parent group-v684512. [ 816.435313] env[68964]: DEBUG oslo.service.loopingcall [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 816.435461] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 816.435650] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e0d40cb1-07e6-4489-89eb-6c73862a4b55 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.454585] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 816.454585] env[68964]: value = "task-3431562"
[ 816.454585] env[68964]: _type = "Task"
[ 816.454585] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 816.461980] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431562, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 816.965727] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431562, 'name': CreateVM_Task, 'duration_secs': 0.294144} completed successfully.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 816.965896] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 816.966656] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 816.966883] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 816.967235] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 816.967523] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bc322e52-6eea-4a5c-9e5a-61a76521ec51 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.971878] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Waiting for the task: (returnval){
[ 816.971878] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a8ec58-21ed-d4fb-f0aa-5a5ea9c1a00c"
[ 816.971878] env[68964]: _type = "Task"
[ 816.971878] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 816.980165] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a8ec58-21ed-d4fb-f0aa-5a5ea9c1a00c, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.482072] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 817.482372] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 817.482540] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 818.574036] env[68964]: DEBUG nova.compute.manager [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Received event network-changed-f5d68518-253f-4997-87c9-fdf6ee36d9eb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 818.574036] env[68964]: DEBUG nova.compute.manager [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Refreshing instance network info cache due to event network-changed-f5d68518-253f-4997-87c9-fdf6ee36d9eb. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 818.574036] env[68964]: DEBUG oslo_concurrency.lockutils [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] Acquiring lock "refresh_cache-cd99acb8-f78c-4c03-8c2e-2e9d50d18969" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 818.574475] env[68964]: DEBUG oslo_concurrency.lockutils [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] Acquired lock "refresh_cache-cd99acb8-f78c-4c03-8c2e-2e9d50d18969" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 818.574475] env[68964]: DEBUG nova.network.neutron [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Refreshing network info cache for port f5d68518-253f-4997-87c9-fdf6ee36d9eb {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 819.179748] env[68964]: DEBUG nova.network.neutron [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Updated VIF entry in instance network info cache for port f5d68518-253f-4997-87c9-fdf6ee36d9eb. 
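The "Processing image" sequence above is the image-cache path: the per-image lock "[datastore1] devstack-image-cache_base/b0d1c28b-..." ensures that only one build fetches or converts a given image while concurrent builds of the same image wait. The underlying shape is double-checked locking around a cache fill; a sketch with hypothetical exists/download helpers, not the actual vmops code:

    # Double-checked locking around a shared image cache, the pattern the
    # per-image datastore locks above implement. Helpers are hypothetical;
    # the real lock name is per image id, as in the log.
    import threading

    _cache_lock = threading.Lock()

    def fetch_image_if_missing(image_id, exists, download):
        if exists(image_id):          # cheap unlocked check
            return
        with _cache_lock:             # "[datastore1] .../<image_id>"
            if exists(image_id):      # re-check: another builder won the race
                return
            download(image_id)        # at most one download per cached image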
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 819.180170] env[68964]: DEBUG nova.network.neutron [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Updating instance_info_cache with network_info: [{"id": "f5d68518-253f-4997-87c9-fdf6ee36d9eb", "address": "fa:16:3e:48:8b:9e", "network": {"id": "b2e96435-2155-409c-9d63-91fb9cd4be6d", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1506989777-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6cdd0c10c06d4d55af1dd42121fb3692", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cae1d6a8-cbba-4bbf-af10-ba5467340475", "external-id": "nsx-vlan-transportzone-271", "segmentation_id": 271, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf5d68518-25", "ovs_interfaceid": "f5d68518-253f-4997-87c9-fdf6ee36d9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 819.189497] env[68964]: DEBUG oslo_concurrency.lockutils [req-9e148f21-9f6d-4dfa-b601-dc796589464c req-02bbdb55-13b1-4f2d-91ac-f2608c9b8edd service nova] Releasing lock "refresh_cache-cd99acb8-f78c-4c03-8c2e-2e9d50d18969" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 819.813926] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 819.814228] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 839.742762] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 840.724299] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 840.724540] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 840.738290] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 840.738290] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 840.738459] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 840.738564] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 840.739852] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac008e0c-18df-4e54-a4ac-1969d6788e07 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.748643] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-287fba6d-24ea-4abb-ba0b-259da97b6c3c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.763762] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7bf7c0d-0ce4-4dc7-b4b4-dec09e1a5e40 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.770149] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2483a6ef-6d94-4a29-ade7-e8865d15cd9d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 840.799817] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 840.799999] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 840.800208] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 840.872222] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 617f46a1-ca50-4561-9d0f-a596e35bf26d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.872412] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ed73ed7d-e299-472a-805c-32bf83e96f8d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.872543] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.872665] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.872785] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.873033] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 329835df-cb38-495e-8a0e-539a396ddc74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.873033] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.873159] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.873247] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.873361] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 840.886098] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 323acb55-859a-4545-a046-1934cf98be6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.896773] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.907482] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fa867511-44ae-47e6-8c05-5f2abf8eae88 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.918221] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 888fb47f-5f48-415c-9289-61b9c42523e5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.928362] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c3ad57a5-1ea2-484a-b014-6276e0ee7914 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.940254] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7dbca935-17b3-4a4b-ae3e-558bc802f9b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.950964] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.961030] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8370a744-2602-410e-a509-e8487810e266 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.973082] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.982431] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0c909272-30a0-40b7-ad1d-90933925ff6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 840.991760] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3005e937-65d2-4e41-8dd7-2fecaaa15365 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.001467] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.012167] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7c21c92e-16ed-4e2c-90d5-9391b1eeb703 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.021599] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 749fc36c-c3de-4762-bae7-515dec3c7377 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.031105] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 04e18d39-9cf6-4c0e-ae33-29e955827571 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.040060] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 92c1d7af-79e0-4cd9-a7e5-a969b4843778 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.049756] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4726af42-5678-4b56-8675-76e30156feaa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.060439] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb007e0d-124f-4ef6-85d7-c68b310e8b9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.073125] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 58afa2a4-da8c-4b32-9c76-587d082de444 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.082759] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 722f8bf7-1634-4190-9cc0-49b2a28c367e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.091903] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.100862] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.109700] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
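The audit above sorts placement allocations into two buckets: instances actively managed on this host keep their allocations, while instances that are scheduled here but have not started building are skipped rather than healed. Roughly:

    # Rough shape of the allocation audit in the lines above: keep
    # allocations for instances this host actively manages, skip healing
    # for instances that are scheduled here but not yet started.
    def audit_allocations(allocations, managed, scheduled):
        kept, skipped = [], []
        for uuid in allocations:        # uuid -> {'resources': {...}}
            if uuid in managed:
                kept.append(uuid)       # "actively managed ... in placement"
            elif uuid in scheduled:
                skipped.append(uuid)    # "Skipping heal of allocation"
        return kept, skipped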
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 841.109917] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 841.110074] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 841.483263] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69bb5f0-5a27-47dd-b488-23f7376f35fa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.490910] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e109516-cb73-4f9c-9092-b0352a687855 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.521694] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e012105a-9905-41e2-a0bb-a76e68d05480 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.529078] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-646bd9d9-682d-463d-9ab3-a640806186bb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 841.542738] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 841.551320] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 841.565308] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 841.565499] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.765s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 842.561081] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 842.561489] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 842.561521] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 842.561667] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 842.561811] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 842.724241] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 843.724366] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 844.724684] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 844.724977] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 844.725027] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 844.745437] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.745634] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.745798] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.745971] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.746232] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.746415] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.746553] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.746683] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.746802] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.746931] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 844.747046] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 848.742538] env[68964]: WARNING oslo_vmware.rw_handles [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 848.742538] env[68964]: ERROR oslo_vmware.rw_handles [ 848.743318] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 848.744746] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 848.745008] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Copying Virtual Disk [datastore2] vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/13fc7aca-b22c-41da-814d-efd2e84560df/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 848.745314] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d60e3dec-e690-41e2-b6d0-ade66b50efe3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.755240] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 
tempest-ServersAdminNegativeTestJSON-370703798-project-member] Waiting for the task: (returnval){ [ 848.755240] env[68964]: value = "task-3431563" [ 848.755240] env[68964]: _type = "Task" [ 848.755240] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 848.763158] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Task: {'id': task-3431563, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 849.265568] env[68964]: DEBUG oslo_vmware.exceptions [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 849.265870] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 849.266443] env[68964]: ERROR nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 849.266443] env[68964]: Faults: ['InvalidArgument'] [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Traceback (most recent call last): [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] yield resources [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self.driver.spawn(context, instance, image_meta, [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self._fetch_image_if_missing(context, vi) [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, 
in _fetch_image_if_missing [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] image_cache(vi, tmp_image_ds_loc) [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] vm_util.copy_virtual_disk( [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] session._wait_for_task(vmdk_copy_task) [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] return self.wait_for_task(task_ref) [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] return evt.wait() [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] result = hub.switch() [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] return self.greenlet.switch() [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self.f(*self.args, **self.kw) [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] raise exceptions.translate_fault(task_info.error) [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Faults: ['InvalidArgument'] [ 849.266443] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] [ 849.267461] env[68964]: INFO nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Terminating instance [ 849.268330] env[68964]: DEBUG 
oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 849.268524] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 849.268763] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2630a32b-094b-4de6-82fc-852eb1bc151e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.271101] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 849.271296] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 849.272055] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57f50ce4-ef1c-421e-a597-50bf216ed999 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.278642] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 849.278858] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-414cd4f2-0d87-4789-bdba-d4cf83993813 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.281037] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 849.281212] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 849.282145] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4a69cba-c1dd-45ee-af14-6f1b7a8d721f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.286809] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Waiting for the task: (returnval){ [ 849.286809] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d3739e-bed3-2690-2784-98c1cbf81251" [ 849.286809] env[68964]: _type = "Task" [ 849.286809] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 849.295044] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d3739e-bed3-2690-2784-98c1cbf81251, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 849.347580] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 849.347907] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 849.348184] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Deleting the datastore file [datastore2] ed73ed7d-e299-472a-805c-32bf83e96f8d {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 849.348543] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-890201ed-34d1-4b48-ae61-2934322c8598 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.355766] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Waiting for the task: (returnval){ [ 849.355766] env[68964]: value = "task-3431565" [ 849.355766] env[68964]: _type = "Task" [ 849.355766] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 849.364526] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Task: {'id': task-3431565, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 849.797611] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 849.797884] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Creating directory with path [datastore2] vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 849.798094] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4443eb4f-decf-4b3c-9eb1-c28c6462aba9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.811032] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Created directory with path [datastore2] vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 849.811032] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Fetch image to [datastore2] vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 849.811032] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 849.811032] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48fe7949-d87b-4c41-93da-774d7e21d483 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.817852] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea3f230e-f089-45a9-9595-47245ecc7579 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.827026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-806abb41-4414-4923-84a2-6d46d2f1d03f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.862664] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-590adebc-eb10-42d3-8da4-1e558911e2e8 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.871272] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f6249337-80d1-4dd3-b808-462f4f278ed3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.873056] env[68964]: DEBUG oslo_vmware.api [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Task: {'id': task-3431565, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076864} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 849.873329] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 849.873487] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 849.873653] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 849.873824] env[68964]: INFO nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Took 0.60 seconds to destroy the instance on the hypervisor. 
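The CopyVirtualDisk_Task and DeleteDatastoreFile_Task records above all follow the same oslo.vmware pattern: wait_for_task blocks while a looping call polls the task's state and, when the task reports an error, translates the fault into a VimFaultException (here InvalidArgument on fileType). Below is a minimal, self-contained sketch of that polling loop; FakeSession, this simplified wait_for_task, and the cut-down VimFaultException are illustrative stand-ins, not the real oslo_vmware.api code, which drives the poll with oslo_service loopingcall under eventlet.

# Minimal sketch of the task-polling pattern seen in the trace above.
# FakeSession stands in for the VMware session object; the real library
# polls via oslo_service loopingcall and signals an eventlet Event.
import time

class VimFaultException(Exception):
    """Cut-down stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class FakeSession:
    """Yields task-info dicts the way successive RetrieveProperties calls would."""
    def __init__(self, states):
        self._states = iter(states)

    def get_task_info(self, task_ref):
        return next(self._states)

def wait_for_task(session, task_ref, interval=0.5):
    """Poll until the task leaves 'running'; raise a fault on error."""
    while True:
        info = session.get_task_info(task_ref)
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # Mirrors _poll_task raising exceptions.translate_fault(...)
            raise VimFaultException(info['faults'], info['message'])
        time.sleep(interval)

if __name__ == '__main__':
    session = FakeSession([
        {'state': 'running'},
        {'state': 'error', 'faults': ['InvalidArgument'],
         'message': 'A specified parameter was not correct: fileType'},
    ])
    try:
        wait_for_task(session, 'task-3431563', interval=0.01)
    except VimFaultException as exc:
        print(f'Task failed: {exc} (faults: {exc.fault_list})')

Run as-is, this prints the same fault text that the traceback above propagates out of _poll_task into copy_virtual_disk and up through driver.spawn.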
[ 849.876134] env[68964]: DEBUG nova.compute.claims [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 849.876308] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.876530] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.896903] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 849.950438] env[68964]: DEBUG oslo_vmware.rw_handles [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 850.009832] env[68964]: DEBUG oslo_vmware.rw_handles [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 850.010029] env[68964]: DEBUG oslo_vmware.rw_handles [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 850.368731] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc4e5904-e046-4b52-a82b-4c76efcaba22 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.376662] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eed13742-6eaa-49d3-8d92-3c6b0bef5459 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.407779] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24f05af1-07ed-44c6-8bb6-4aaad43d1ad8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.415926] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca43eac5-a7aa-49e8-8600-74b3d29dcb28 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.432383] env[68964]: DEBUG nova.compute.provider_tree [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 850.440061] env[68964]: DEBUG nova.scheduler.client.report [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 850.459480] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.583s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.460073] env[68964]: ERROR nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 850.460073] env[68964]: Faults: ['InvalidArgument'] [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Traceback (most recent call last): [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self.driver.spawn(context, instance, image_meta, [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self._fetch_image_if_missing(context, vi) [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] image_cache(vi, tmp_image_ds_loc) [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] vm_util.copy_virtual_disk( [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] session._wait_for_task(vmdk_copy_task) [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] return self.wait_for_task(task_ref) [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] return evt.wait() [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] result = hub.switch() [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] return self.greenlet.switch() [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] self.f(*self.args, **self.kw) [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] raise exceptions.translate_fault(task_info.error) [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Faults: ['InvalidArgument'] [ 850.460073] env[68964]: ERROR nova.compute.manager [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] [ 850.460846] env[68964]: DEBUG nova.compute.utils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 850.462320] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Build of instance ed73ed7d-e299-472a-805c-32bf83e96f8d was re-scheduled: A specified parameter was not correct: fileType [ 850.462320] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 850.462681] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 850.462848] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 850.463007] env[68964]: DEBUG nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 850.463953] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 850.770215] env[68964]: DEBUG nova.network.neutron [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.789850] env[68964]: INFO nova.compute.manager [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] [instance: ed73ed7d-e299-472a-805c-32bf83e96f8d] Took 0.32 seconds to deallocate network for instance. [ 850.892574] env[68964]: INFO nova.scheduler.client.report [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Deleted allocations for instance ed73ed7d-e299-472a-805c-32bf83e96f8d [ 850.916658] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa0acc5-f50f-4e99-8da4-684271fac4f1 tempest-ServersAdminNegativeTestJSON-370703798 tempest-ServersAdminNegativeTestJSON-370703798-project-member] Lock "ed73ed7d-e299-472a-805c-32bf83e96f8d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.260s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.933608] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 850.982474] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.982720] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 850.984289] env[68964]: INFO nova.compute.claims [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 851.394216] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94d7e899-9e41-4bcb-97d8-1d52fd928063 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.402320] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-745529c6-7b34-4c06-b770-37d864805d88 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.433253] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cef6a1c-b992-4352-a4e1-7b499db058f2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.440723] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f4169bb-6bfb-49d9-a374-07914500b87b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.454010] env[68964]: DEBUG nova.compute.provider_tree [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 851.465975] env[68964]: DEBUG nova.scheduler.client.report [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 851.483696] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.501s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 851.484571] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 851.523020] env[68964]: DEBUG nova.compute.utils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 851.523773] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 851.524707] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 851.537917] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 851.597165] env[68964]: DEBUG nova.policy [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e48ab8f08df140f7b63e650e03327fd0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1364cfe6dafe4df283780738bce3c28f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 851.617772] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 851.645334] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 851.645817] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 851.645817] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 851.645929] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 851.646162] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 851.646371] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 851.646598] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 851.646855] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 851.646919] env[68964]: DEBUG nova.virt.hardware [None 
req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 851.647076] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 851.647298] env[68964]: DEBUG nova.virt.hardware [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 851.648192] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaac473d-40ae-4a91-9f03-a06a321b00e4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.657605] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-541bd051-dff7-4e66-8476-d9ba2add4de9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.108978] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Successfully created port: 85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 852.485675] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "617f46a1-ca50-4561-9d0f-a596e35bf26d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 853.129720] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Successfully updated port: 85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 853.146218] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "refresh_cache-323acb55-859a-4545-a046-1934cf98be6d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 853.146371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquired lock "refresh_cache-323acb55-859a-4545-a046-1934cf98be6d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 853.146526] env[68964]: DEBUG nova.network.neutron [None 
req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 853.201819] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 853.269426] env[68964]: DEBUG nova.compute.manager [req-72f99f54-c78d-4bf5-abf8-7c3f69d2a38e req-62f5bdc0-3057-4f24-8d77-b4bb88c38c64 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Received event network-vif-plugged-85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 853.269426] env[68964]: DEBUG oslo_concurrency.lockutils [req-72f99f54-c78d-4bf5-abf8-7c3f69d2a38e req-62f5bdc0-3057-4f24-8d77-b4bb88c38c64 service nova] Acquiring lock "323acb55-859a-4545-a046-1934cf98be6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 853.269426] env[68964]: DEBUG oslo_concurrency.lockutils [req-72f99f54-c78d-4bf5-abf8-7c3f69d2a38e req-62f5bdc0-3057-4f24-8d77-b4bb88c38c64 service nova] Lock "323acb55-859a-4545-a046-1934cf98be6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 853.269426] env[68964]: DEBUG oslo_concurrency.lockutils [req-72f99f54-c78d-4bf5-abf8-7c3f69d2a38e req-62f5bdc0-3057-4f24-8d77-b4bb88c38c64 service nova] Lock "323acb55-859a-4545-a046-1934cf98be6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 853.269426] env[68964]: DEBUG nova.compute.manager [req-72f99f54-c78d-4bf5-abf8-7c3f69d2a38e req-62f5bdc0-3057-4f24-8d77-b4bb88c38c64 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] No waiting events found dispatching network-vif-plugged-85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 853.269426] env[68964]: WARNING nova.compute.manager [req-72f99f54-c78d-4bf5-abf8-7c3f69d2a38e req-62f5bdc0-3057-4f24-8d77-b4bb88c38c64 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Received unexpected event network-vif-plugged-85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 for instance with vm_state building and task_state spawning. 
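The exchange above — Neutron creating and activating port 85eb4b92, nova-compute taking the per-instance "-events" lock, popping the event, and then warning that network-vif-plugged arrived with "No waiting events found" — is the external-event handshake: the spawn path registers a waiter before plugging the VIF, and the Neutron notification releases it. Below is a minimal sketch of that pattern, assuming plain threading instead of nova's eventlet machinery; the class and method names echo the log, but the body is an illustrative reconstruction, not nova's actual code.

import threading

class InstanceEvents:
    """Per-instance registry of events a build is waiting for (sketch)."""

    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock
        self._events = {}               # {instance_uuid: {event_name: Event}}

    def prepare_for_event(self, instance_uuid, event_name):
        # Register interest *before* triggering the external action.
        with self._lock:
            waiter = threading.Event()
            self._events.setdefault(instance_uuid, {})[event_name] = waiter
            return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        # Pop the waiter for an incoming event; None means nobody was waiting.
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Matches the WARNING above: the port came up before anyone waited.
        print("Received unexpected event %s for instance %s"
              % (event_name, instance_uuid))
    else:
        waiter.set()  # wakes the spawn path blocked on waiter.wait()

events = InstanceEvents()
external_instance_event(events, "323acb55-859a-4545-a046-1934cf98be6d",
                        "network-vif-plugged-85eb4b92")

In the log the WARNING fires because the port became active before the spawn path registered its waiter; with vm_state building and task_state spawning that race is typically benign, which is why the build simply continues.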
[ 853.443710] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Updating instance_info_cache with network_info: [{"id": "85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1", "address": "fa:16:3e:e4:5e:06", "network": {"id": "22adc17c-a700-43f3-8362-6cf07b3773bb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1239176632-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1364cfe6dafe4df283780738bce3c28f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap85eb4b92-8b", "ovs_interfaceid": "85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 853.470034] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Releasing lock "refresh_cache-323acb55-859a-4545-a046-1934cf98be6d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 853.470034] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance network_info: |[{"id": "85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1", "address": "fa:16:3e:e4:5e:06", "network": {"id": "22adc17c-a700-43f3-8362-6cf07b3773bb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1239176632-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1364cfe6dafe4df283780738bce3c28f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap85eb4b92-8b", "ovs_interfaceid": "85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 853.470034] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None 
req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e4:5e:06', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4b033f4d-2e92-4702-add6-410a29d3f251', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 853.476904] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Creating folder: Project (1364cfe6dafe4df283780738bce3c28f). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 853.477997] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7254bd3a-a74d-4bf9-a8e0-14ea43e6ef1f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.491012] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Created folder: Project (1364cfe6dafe4df283780738bce3c28f) in parent group-v684465. [ 853.491012] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Creating folder: Instances. Parent ref: group-v684515. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 853.491012] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-153796df-1089-4414-b6db-fc8156ba555d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.497094] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Created folder: Instances in parent group-v684515. [ 853.497461] env[68964]: DEBUG oslo.service.loopingcall [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 853.497748] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 853.498074] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0debd807-e388-46d1-ac17-f39caa4099f4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.520907] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 853.520907] env[68964]: value = "task-3431568" [ 853.520907] env[68964]: _type = "Task" [ 853.520907] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 853.529033] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431568, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 854.034779] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431568, 'name': CreateVM_Task, 'duration_secs': 0.299901} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 854.034779] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 854.034779] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 854.034779] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 854.034779] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 854.034779] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90e042a1-0477-4167-847c-3d0fa69306cf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 854.038986] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Waiting for the task: (returnval){ [ 854.038986] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52393e9b-a09e-b630-ed96-773fd1c80839" [ 854.038986] env[68964]: _type = "Task" [ 854.038986] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 854.051038] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52393e9b-a09e-b630-ed96-773fd1c80839, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 854.548657] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 854.548982] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 854.549370] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 855.340473] env[68964]: DEBUG nova.compute.manager [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Received event network-changed-85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 855.340473] env[68964]: DEBUG nova.compute.manager [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Refreshing instance network info cache due to event network-changed-85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 855.340473] env[68964]: DEBUG oslo_concurrency.lockutils [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] Acquiring lock "refresh_cache-323acb55-859a-4545-a046-1934cf98be6d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 855.340661] env[68964]: DEBUG oslo_concurrency.lockutils [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] Acquired lock "refresh_cache-323acb55-859a-4545-a046-1934cf98be6d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 855.340700] env[68964]: DEBUG nova.network.neutron [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Refreshing network info cache for port 85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 855.747520] env[68964]: DEBUG nova.network.neutron [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Updated VIF entry in instance network info cache for port 85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 855.747520] env[68964]: DEBUG nova.network.neutron [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Updating instance_info_cache with network_info: [{"id": "85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1", "address": "fa:16:3e:e4:5e:06", "network": {"id": "22adc17c-a700-43f3-8362-6cf07b3773bb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1239176632-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1364cfe6dafe4df283780738bce3c28f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4b033f4d-2e92-4702-add6-410a29d3f251", "external-id": "nsx-vlan-transportzone-649", "segmentation_id": 649, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap85eb4b92-8b", "ovs_interfaceid": "85eb4b92-8b9c-47d9-94b3-8cd7e9fcb8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.758972] env[68964]: DEBUG oslo_concurrency.lockutils [req-5cb7a3fc-27f3-4bc4-b078-8b8a33d062c7 req-22a1c6ed-e4f8-4c00-87dc-8207a1840737 service nova] Releasing lock "refresh_cache-323acb55-859a-4545-a046-1934cf98be6d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 856.588685] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "7f3f326c-2127-426e-a137-6f33512f4cb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 856.588876] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 858.543024] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 862.454906] env[68964]: WARNING oslo_vmware.rw_handles [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 
tempest-ServerRescueTestJSON-870056286-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 862.454906] env[68964]: ERROR oslo_vmware.rw_handles [ 862.455448] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 862.457019] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 862.457290] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Copying Virtual Disk [datastore1] vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/eb1c5082-7623-4c2c-9d22-6935e2d232ba/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 862.457639] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d7c0f979-cea2-4432-b11a-91f94ce8bee5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.464600] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Waiting for the task: (returnval){ [ 862.464600] env[68964]: value = "task-3431569" [ 862.464600] env[68964]: _type = "Task" [ 862.464600] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 862.472464] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Task: {'id': task-3431569, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 862.976105] env[68964]: DEBUG oslo_vmware.exceptions [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 862.976417] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 862.976947] env[68964]: ERROR nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 862.976947] env[68964]: Faults: ['InvalidArgument'] [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Traceback (most recent call last): [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] yield resources [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self.driver.spawn(context, instance, image_meta, [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self._fetch_image_if_missing(context, vi) [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] image_cache(vi, tmp_image_ds_loc) [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] vm_util.copy_virtual_disk( [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] session._wait_for_task(vmdk_copy_task) [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] return self.wait_for_task(task_ref) [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] return evt.wait() [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] result = hub.switch() [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] return self.greenlet.switch() [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self.f(*self.args, **self.kw) [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] raise exceptions.translate_fault(task_info.error) [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Faults: ['InvalidArgument'] [ 862.976947] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] [ 862.977973] env[68964]: INFO nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Terminating instance [ 862.978826] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 862.980036] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 862.980036] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-888753c8-c72a-44f8-9216-abb18b56d056 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.981490] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 862.981681] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 862.982468] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0335920-3a3c-4db9-9b0a-87dbaa333153 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.989151] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 862.989373] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ae50a8b8-3b76-4c5f-a380-ced1ef487be9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.991538] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 862.991709] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 862.992648] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49f8bf7a-0f17-4385-85e3-74f399701d5e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 862.997187] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Waiting for the task: (returnval){ [ 862.997187] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5221d26d-c9da-47e4-9f97-ae13a0cd5c20" [ 862.997187] env[68964]: _type = "Task" [ 862.997187] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.004644] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5221d26d-c9da-47e4-9f97-ae13a0cd5c20, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 863.466386] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 863.466656] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 863.466836] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Deleting the datastore file [datastore1] 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 863.467168] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3143695f-1ab7-43c5-98e8-a46ddce98cd6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.474351] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Waiting for the task: (returnval){ [ 863.474351] env[68964]: value = "task-3431571" [ 863.474351] env[68964]: _type = "Task" [ 863.474351] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 863.482953] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Task: {'id': task-3431571, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 863.507625] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 863.508603] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Creating directory with path [datastore1] vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 863.508603] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1d93966b-faa5-48cb-91d3-032efdc8d1a6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.527939] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Created directory with path [datastore1] vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 863.528166] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Fetch image to [datastore1] vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 863.528329] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 863.529098] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-919b57ed-cb6a-4807-93d4-554653e08244 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.536060] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b2a7db1-96ad-4387-9838-31d46963cf63 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.545389] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5047c5e-aaba-4b6b-a9a3-757c34b79ba6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.579405] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a86374ab-8056-43df-be15-5c6f409a9f3e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.585076] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-57eebb09-6002-4e64-a3f1-fa881f426bc8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 863.606126] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 863.657434] env[68964]: DEBUG oslo_vmware.rw_handles [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 863.717522] env[68964]: DEBUG oslo_vmware.rw_handles [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 863.717729] env[68964]: DEBUG oslo_vmware.rw_handles [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 863.984452] env[68964]: DEBUG oslo_vmware.api [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Task: {'id': task-3431571, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073259} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 863.984710] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 863.984891] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 863.985084] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 863.985341] env[68964]: INFO nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Took 1.00 seconds to destroy the instance on the hypervisor. [ 863.987348] env[68964]: DEBUG nova.compute.claims [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 863.987551] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 863.987861] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 864.414960] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-354558a3-3b21-4524-930d-3cedf0257cc3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.423778] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feabc501-a188-4eb2-b8e1-4591d006516f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.453533] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c922718-4848-4e1b-8a63-7392b6e4d3e9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.461021] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ad5b34cd-f88f-4774-9d1a-b6e64e4663dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 864.474221] env[68964]: DEBUG nova.compute.provider_tree [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 864.482469] env[68964]: DEBUG nova.scheduler.client.report [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 864.496699] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.508s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 864.496808] env[68964]: ERROR nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 864.496808] env[68964]: Faults: ['InvalidArgument'] [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Traceback (most recent call last): [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self.driver.spawn(context, instance, image_meta, [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self._fetch_image_if_missing(context, vi) [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 
5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] image_cache(vi, tmp_image_ds_loc) [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] vm_util.copy_virtual_disk( [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] session._wait_for_task(vmdk_copy_task) [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] return self.wait_for_task(task_ref) [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] return evt.wait() [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] result = hub.switch() [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] return self.greenlet.switch() [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] self.f(*self.args, **self.kw) [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] raise exceptions.translate_fault(task_info.error) [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Faults: ['InvalidArgument'] [ 864.496808] env[68964]: ERROR nova.compute.manager [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] [ 864.497697] env[68964]: DEBUG nova.compute.utils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 864.498935] env[68964]: DEBUG nova.compute.manager [None 
req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Build of instance 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5 was re-scheduled: A specified parameter was not correct: fileType [ 864.498935] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 864.499322] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 864.499494] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 864.499664] env[68964]: DEBUG nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 864.499827] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 864.923789] env[68964]: DEBUG nova.network.neutron [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.936156] env[68964]: INFO nova.compute.manager [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] [instance: 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5] Took 0.44 seconds to deallocate network for instance.
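The failure traced above follows one chain: CopyVirtualDisk_Task is polled by wait_for_task until vCenter reports the InvalidArgument fault ("A specified parameter was not correct: fileType"), the translated VimFaultException propagates out of spawn, and the compute manager aborts the resource claim, deallocates networking, and re-schedules the build. A minimal sketch of that poll-translate pattern follows, assuming a synchronous loop in place of oslo.vmware's eventlet looping call; the names mirror the log, but the body is illustrative, not the library's actual implementation.

import time

class VimFaultException(Exception):
    """Carries the vSphere fault names alongside the message (sketch)."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(poll_fn, interval=0.5):
    # Poll a task-info callable until it completes, translating errors,
    # like the repeated "progress is 0%" DEBUG lines seen while a task runs.
    while True:
        info = poll_fn()  # e.g. {'state': 'running', 'progress': 0}
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            raise VimFaultException(info.get("faults", []), info["message"])
        time.sleep(interval)

# A failing CopyVirtualDisk_Task shaped like the one in the log:
states = iter([
    {"state": "running", "progress": 0},
    {"state": "error", "faults": ["InvalidArgument"],
     "message": "A specified parameter was not correct: fileType"},
])
try:
    wait_for_task(lambda: next(states), interval=0)
except VimFaultException as exc:
    # The compute manager logs this, aborts the claim, and re-schedules.
    print("Build failed, rescheduling:", exc, exc.fault_list)

Once translated, the fault is handled like any other build failure: the claim on "compute_resources" is released, the allocation is deleted from placement, and the instance is handed back to the scheduler rather than erroring out immediately.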
[ 865.029342] env[68964]: INFO nova.scheduler.client.report [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Deleted allocations for instance 5b2a39da-1d95-4b2d-ab0e-8440f4544ef5 [ 865.047405] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a47847e-7718-478c-aa7c-5c87904f4754 tempest-ServerRescueTestJSON-870056286 tempest-ServerRescueTestJSON-870056286-project-member] Lock "5b2a39da-1d95-4b2d-ab0e-8440f4544ef5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 181.239s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 865.062506] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 865.109277] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 865.109452] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 865.110891] env[68964]: INFO nova.compute.claims [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 865.522184] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a06b6158-2386-42fd-8160-bc18a6368a5d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.529942] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff3c971f-4f21-47aa-b348-119e93abc0ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.560323] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fda62729-2835-437f-bd3f-357685d61e34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.567683] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e2b0f0c-32de-4896-991f-2ce46da1bc4b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.582098] env[68964]: DEBUG nova.compute.provider_tree [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has 
not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 865.593854] env[68964]: DEBUG nova.scheduler.client.report [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 865.609636] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.500s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 865.610136] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 865.649876] env[68964]: DEBUG nova.compute.utils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 865.651339] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 865.651824] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 865.662903] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Start building block device mappings for instance. 
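
The inventory dict in the report above is the provider's whole view of capacity: Placement derives usable capacity per resource class as (total - reserved) * allocation_ratio, which is why this 48-core host can back 192 vCPUs of instances. A minimal sketch using the values from the log (the helper is illustrative, not Nova code):

    # Illustrative only: how the inventory logged above turns into
    # schedulable capacity. Values are copied from the log entry.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def capacity(inv):
        # Placement treats capacity as (total - reserved) * allocation_ratio.
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inv.items()}

    print(capacity(inventory))
    # -> {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
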
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 865.723342] env[68964]: DEBUG nova.policy [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e16c5efbf3634d039bf57dc8feafcb56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0284577172914b56b74ece100e1584e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 865.743138] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 865.769387] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 865.769636] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 865.769795] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 865.769995] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 865.770338] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 865.770466] env[68964]: DEBUG nova.virt.hardware [None 
req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 865.770682] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 865.770843] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 865.771013] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 865.771184] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 865.771354] env[68964]: DEBUG nova.virt.hardware [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 865.772265] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2afa2721-91a8-41cc-947d-23ac2df64cb5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 865.780603] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ea06c39-729b-4797-8a9b-34ccba01fa89 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 866.112385] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Successfully created port: cb232dd9-0113-4f53-a217-13ae199a6623 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 866.628080] env[68964]: DEBUG oslo_concurrency.lockutils [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.095526] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 
tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Successfully updated port: cb232dd9-0113-4f53-a217-13ae199a6623 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 867.108454] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "refresh_cache-fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 867.108663] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired lock "refresh_cache-fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 867.108786] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 867.169082] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 867.396965] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Updating instance_info_cache with network_info: [{"id": "cb232dd9-0113-4f53-a217-13ae199a6623", "address": "fa:16:3e:43:0d:49", "network": {"id": "a54b2f1d-5db7-4c69-b6fe-c1721675a5aa", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1034871949-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0284577172914b56b74ece100e1584e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1f996252-e329-42bd-a897-446dfe2b81cd", "external-id": "nsx-vlan-transportzone-535", "segmentation_id": 535, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb232dd9-01", "ovs_interfaceid": "cb232dd9-0113-4f53-a217-13ae199a6623", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 867.412180] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Releasing lock 
"refresh_cache-fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 867.412488] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance network_info: |[{"id": "cb232dd9-0113-4f53-a217-13ae199a6623", "address": "fa:16:3e:43:0d:49", "network": {"id": "a54b2f1d-5db7-4c69-b6fe-c1721675a5aa", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1034871949-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0284577172914b56b74ece100e1584e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1f996252-e329-42bd-a897-446dfe2b81cd", "external-id": "nsx-vlan-transportzone-535", "segmentation_id": 535, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb232dd9-01", "ovs_interfaceid": "cb232dd9-0113-4f53-a217-13ae199a6623", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 867.412903] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:43:0d:49', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1f996252-e329-42bd-a897-446dfe2b81cd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cb232dd9-0113-4f53-a217-13ae199a6623', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 867.420388] env[68964]: DEBUG oslo.service.loopingcall [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 867.420957] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 867.421110] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-abcd3018-32c3-4a22-8d50-9bfffecb3ec9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.442450] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 867.442450] env[68964]: value = "task-3431572" [ 867.442450] env[68964]: _type = "Task" [ 867.442450] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.450424] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431572, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 867.555856] env[68964]: DEBUG nova.compute.manager [req-d37c7409-7a2c-4f87-a9d7-549a1f696621 req-3a732822-d6a9-4374-8f57-cdcb25457938 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Received event network-vif-plugged-cb232dd9-0113-4f53-a217-13ae199a6623 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 867.556089] env[68964]: DEBUG oslo_concurrency.lockutils [req-d37c7409-7a2c-4f87-a9d7-549a1f696621 req-3a732822-d6a9-4374-8f57-cdcb25457938 service nova] Acquiring lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.556296] env[68964]: DEBUG oslo_concurrency.lockutils [req-d37c7409-7a2c-4f87-a9d7-549a1f696621 req-3a732822-d6a9-4374-8f57-cdcb25457938 service nova] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 867.556462] env[68964]: DEBUG oslo_concurrency.lockutils [req-d37c7409-7a2c-4f87-a9d7-549a1f696621 req-3a732822-d6a9-4374-8f57-cdcb25457938 service nova] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 867.556688] env[68964]: DEBUG nova.compute.manager [req-d37c7409-7a2c-4f87-a9d7-549a1f696621 req-3a732822-d6a9-4374-8f57-cdcb25457938 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] No waiting events found dispatching network-vif-plugged-cb232dd9-0113-4f53-a217-13ae199a6623 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 867.556864] env[68964]: WARNING nova.compute.manager [req-d37c7409-7a2c-4f87-a9d7-549a1f696621 req-3a732822-d6a9-4374-8f57-cdcb25457938 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Received unexpected event network-vif-plugged-cb232dd9-0113-4f53-a217-13ae199a6623 for instance with vm_state building and task_state spawning. [ 867.757731] env[68964]: DEBUG oslo_concurrency.lockutils [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.951860] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431572, 'name': CreateVM_Task, 'duration_secs': 0.283514} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 867.952178] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 867.953190] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 867.953190] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 867.953328] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 867.953568] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bfcc8833-3eb3-4a02-b039-04dd0ecc7c33 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.958762] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){ [ 867.958762] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528810b7-cceb-8b45-2328-8d6866b6609e" [ 867.958762] env[68964]: _type = "Task" [ 867.958762] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.966358] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528810b7-cceb-8b45-2328-8d6866b6609e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 868.471778] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 868.472046] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 868.472260] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 869.781021] env[68964]: DEBUG nova.compute.manager [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Received event network-changed-cb232dd9-0113-4f53-a217-13ae199a6623 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 869.781262] env[68964]: DEBUG nova.compute.manager [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Refreshing instance network info cache due to event network-changed-cb232dd9-0113-4f53-a217-13ae199a6623. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 869.781456] env[68964]: DEBUG oslo_concurrency.lockutils [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] Acquiring lock "refresh_cache-fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 869.781721] env[68964]: DEBUG oslo_concurrency.lockutils [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] Acquired lock "refresh_cache-fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 869.781771] env[68964]: DEBUG nova.network.neutron [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Refreshing network info cache for port cb232dd9-0113-4f53-a217-13ae199a6623 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 870.093626] env[68964]: DEBUG nova.network.neutron [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Updated VIF entry in instance network info cache for port cb232dd9-0113-4f53-a217-13ae199a6623. 
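
The refresh above re-reads Neutron and rewrites the instance's network info cache; what it stores is the plain JSON structure logged twice in this section. Pulling the fixed IPs back out of it is straightforward, as this sketch over a trimmed copy of that structure shows (the helper name is ours):

    # Trimmed copy of the network_info structure from the log above.
    network_info = [{
        "id": "cb232dd9-0113-4f53-a217-13ae199a6623",
        "address": "fa:16:3e:43:0d:49",
        "network": {"subnets": [{
            "cidr": "192.168.128.0/28",
            "ips": [{"address": "192.168.128.5", "type": "fixed"}],
        }]},
    }]

    def fixed_ips(nw_info):
        # Walk VIFs -> subnets -> IPs and keep the fixed addresses.
        return [ip["address"]
                for vif in nw_info
                for subnet in vif["network"]["subnets"]
                for ip in subnet["ips"]
                if ip["type"] == "fixed"]

    print(fixed_ips(network_info))  # ['192.168.128.5']
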
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 870.093987] env[68964]: DEBUG nova.network.neutron [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Updating instance_info_cache with network_info: [{"id": "cb232dd9-0113-4f53-a217-13ae199a6623", "address": "fa:16:3e:43:0d:49", "network": {"id": "a54b2f1d-5db7-4c69-b6fe-c1721675a5aa", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1034871949-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0284577172914b56b74ece100e1584e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1f996252-e329-42bd-a897-446dfe2b81cd", "external-id": "nsx-vlan-transportzone-535", "segmentation_id": 535, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcb232dd9-01", "ovs_interfaceid": "cb232dd9-0113-4f53-a217-13ae199a6623", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.105893] env[68964]: DEBUG oslo_concurrency.lockutils [req-10b25634-d96e-47c1-8aa4-97a0117440b9 req-324ec193-4a6c-4598-9dbd-455ea96217e6 service nova] Releasing lock "refresh_cache-fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 873.214483] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 873.214814] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 873.985889] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 875.755031] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 
tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "329835df-cb38-495e-8a0e-539a396ddc74" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 883.412151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "323acb55-859a-4545-a046-1934cf98be6d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 883.862227] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.665688] env[68964]: DEBUG oslo_concurrency.lockutils [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.168806] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "238794bb-9995-4bb0-954d-7ca0ef825e19" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.168806] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 888.451077] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b1d4e238-f8c8-4a77-94c3-50f8bd6cb5c7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "8b11bda8-2923-4641-869b-39e4fce369b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 888.451372] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b1d4e238-f8c8-4a77-94c3-50f8bd6cb5c7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8b11bda8-2923-4641-869b-39e4fce369b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 894.961727] env[68964]: 
DEBUG oslo_concurrency.lockutils [None req-5e66c901-f711-434a-ad0f-8f298c733592 tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] Acquiring lock "d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 894.962020] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5e66c901-f711-434a-ad0f-8f298c733592 tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] Lock "d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 895.643124] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ba3845ee-bf0e-480e-b0ed-bff5d17c6ecf tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] Acquiring lock "120c1330-9cdf-4db2-8c9f-1fa08dcad359" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 895.643277] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ba3845ee-bf0e-480e-b0ed-bff5d17c6ecf tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] Lock "120c1330-9cdf-4db2-8c9f-1fa08dcad359" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 899.725618] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 899.725903] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 899.741452] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] There are 0 instances to clean {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 899.741805] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 899.741980] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances with incomplete migration {{(pid=68964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 899.759154] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 899.944930] 
env[68964]: WARNING oslo_vmware.rw_handles [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 899.944930] env[68964]: ERROR oslo_vmware.rw_handles [ 899.945406] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 899.948182] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 899.948491] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Copying Virtual Disk [datastore2] vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/7c0df2a1-dbba-4d06-bba7-128ccc0b34a1/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 899.948808] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8fe0a86e-64b5-4957-8079-37b151a2e1e9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.957169] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Waiting for the task: (returnval){ [ 899.957169] env[68964]: value = "task-3431573" [ 899.957169] env[68964]: _type = "Task" [ 899.957169] env[68964]: } to complete. 
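
The CopyVirtualDisk_Task just issued is the image-cache step: the freshly downloaded tmp-sparse.vmdk is copied to its cached name, and a fault on the task surfaces as an exception when the wait completes, which is exactly what happens next. A minimal sketch of that step built on oslo.vmware's session API (helper and argument names are assumptions drawn from the log, not Nova's exact code):

    def cache_sparse_image(session, dc_ref, src, dst):
        # src/dst are datastore paths such as
        # "[datastore2] vmware_temp/.../tmp-sparse.vmdk".
        vdm = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', vdm,
                                  sourceName=src, sourceDatacenter=dc_ref,
                                  destName=dst, destDatacenter=dc_ref)
        # Raises oslo_vmware.exceptions.VimFaultException on a task fault,
        # e.g. the "InvalidArgument: fileType" failure seen below.
        session.wait_for_task(task)
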
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 899.965145] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Task: {'id': task-3431573, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 900.467074] env[68964]: DEBUG oslo_vmware.exceptions [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 900.467359] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 900.467890] env[68964]: ERROR nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 900.467890] env[68964]: Faults: ['InvalidArgument'] [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Traceback (most recent call last): [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] yield resources [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self.driver.spawn(context, instance, image_meta, [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self._fetch_image_if_missing(context, vi) [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] image_cache(vi, tmp_image_ds_loc) [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
537, in _cache_sparse_image [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] vm_util.copy_virtual_disk( [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] session._wait_for_task(vmdk_copy_task) [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] return self.wait_for_task(task_ref) [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] return evt.wait() [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] result = hub.switch() [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] return self.greenlet.switch() [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self.f(*self.args, **self.kw) [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] raise exceptions.translate_fault(task_info.error) [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Faults: ['InvalidArgument'] [ 900.467890] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] [ 900.468873] env[68964]: INFO nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Terminating instance [ 900.469780] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 900.469990] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 900.470234] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-875a7373-9088-4cbc-99e2-ff87bb875813 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.472331] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 900.472524] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 900.473250] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653db6a7-5146-4eb3-bbbf-01722f4d0ed5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.480073] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 900.480304] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-65dd9b90-b39d-4b25-93d0-2f9a45acba56 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.483772] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 900.483772] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Folder [datastore2] devstack-image-cache_base created. 
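
The mkdir above is idempotent: MakeDirectory is invoked with createParentDirectories set, and a folder that already exists counts as success, so concurrent spawns can all race through this step safely. A rough sketch under those assumptions (names illustrative):

    from oslo_vmware import exceptions as vexc

    def create_folder_if_missing(session, dc_ref, ds_name, folder):
        path = '[%s] %s' % (ds_name, folder)
        try:
            session.invoke_api(session.vim, 'MakeDirectory',
                               session.vim.service_content.fileManager,
                               name=path, datacenter=dc_ref,
                               createParentDirectories=True)
        except vexc.FileAlreadyExistsException:
            pass  # same outcome either way, as the log entry notes
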
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 900.484334] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-353bd0b7-f295-45b3-bb1d-bc0fbfc66cc8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.488564] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Waiting for the task: (returnval){ [ 900.488564] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]520dcecb-0959-524d-deea-55dd9dcb8331" [ 900.488564] env[68964]: _type = "Task" [ 900.488564] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 900.497300] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]520dcecb-0959-524d-deea-55dd9dcb8331, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 900.545316] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 900.545542] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 900.545722] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Deleting the datastore file [datastore2] 617f46a1-ca50-4561-9d0f-a596e35bf26d {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 900.545979] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f131633a-18a5-437f-8f7f-ed55d9e1003c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.552387] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Waiting for the task: (returnval){ [ 900.552387] env[68964]: value = "task-3431575" [ 900.552387] env[68964]: _type = "Task" [ 900.552387] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 900.564054] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Task: {'id': task-3431575, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 900.999013] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 900.999318] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Creating directory with path [datastore2] vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 900.999559] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-15284464-97fd-4bd5-9afb-54e4c132845e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.011270] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Created directory with path [datastore2] vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 901.011472] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Fetch image to [datastore2] vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 901.011638] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 901.012405] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59c218fb-f7b6-4320-a368-b0a0d09c7a98 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.019094] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bccadd27-4d96-4f8c-bb78-284f3615ec7c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.028113] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a09b7ec1-4c7b-4c51-b19a-3dece214ed2f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.063704] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08ce3cc5-ee89-4d4d-82d8-c907be0c93aa {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.070744] env[68964]: DEBUG oslo_vmware.api [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Task: {'id': task-3431575, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075997} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 901.072153] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 901.072345] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 901.072518] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 901.072691] env[68964]: INFO nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Took 0.60 seconds to destroy the instance on the hypervisor. 
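
Editor's note: the entries above show the poll-until-terminal pattern oslo.vmware logs around every vCenter task: a task is submitted (SearchDatastore_Task, DeleteDatastoreFile_Task), wait_for_task reports "progress is 0%" while polling, and _poll_task finally logs "completed successfully" with a duration. A minimal sketch of that loop follows; get_task_info is a hypothetical callable standing in for the TaskInfo read oslo.vmware performs, not the library's actual API.

    import time

    def wait_for_task(get_task_info, poll_interval=0.5, timeout=300.0):
        # Poll a vCenter task until it reaches a terminal state, mirroring
        # the progress/completed lines logged by oslo_vmware.api above.
        # get_task_info is assumed to return an object with .state
        # ('queued' | 'running' | 'success' | 'error'), .result and .error.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware translates this case into a VimFaultException,
                # as seen later in this log.
                raise RuntimeError(info.error)
            time.sleep(poll_interval)  # still queued/running: retry
        raise TimeoutError('task did not reach a terminal state in time')
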
[ 901.074595] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bd4009e1-022e-46ca-a165-2d85264734a6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.076579] env[68964]: DEBUG nova.compute.claims [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 901.076684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.076829] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.098532] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 901.153102] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 901.213312] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 901.213391] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 901.581640] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-650cd82f-babe-4c30-88fd-6dc0fc031de5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.589092] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6a181cf-289e-4d00-892a-c7699adaec3b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.619341] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33e71b90-f0b5-4a03-8094-d06702788fbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.626959] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29e9197a-d40f-4d96-a9c2-db66eded32d2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.642282] env[68964]: DEBUG nova.compute.provider_tree [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 901.652845] env[68964]: DEBUG nova.scheduler.client.report [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 901.669990] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.593s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 901.670516] env[68964]: ERROR nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 901.670516] env[68964]: Faults: ['InvalidArgument'] [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Traceback (most recent call last): [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 
617f46a1-ca50-4561-9d0f-a596e35bf26d] self.driver.spawn(context, instance, image_meta, [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self._fetch_image_if_missing(context, vi) [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] image_cache(vi, tmp_image_ds_loc) [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] vm_util.copy_virtual_disk( [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] session._wait_for_task(vmdk_copy_task) [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] return self.wait_for_task(task_ref) [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] return evt.wait() [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] result = hub.switch() [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] return self.greenlet.switch() [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] self.f(*self.args, **self.kw) [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] raise exceptions.translate_fault(task_info.error) [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Faults: ['InvalidArgument'] [ 901.670516] env[68964]: ERROR nova.compute.manager [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] [ 901.671281] env[68964]: DEBUG nova.compute.utils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 901.672697] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Build of instance 617f46a1-ca50-4561-9d0f-a596e35bf26d was re-scheduled: A specified parameter was not correct: fileType [ 901.672697] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 901.673067] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 901.673239] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 901.673412] env[68964]: DEBUG nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 901.673566] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 901.761288] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 901.761288] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 901.761288] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 901.761288] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 901.772264] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.772520] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.772716] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 901.772900] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 901.774008] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43175ef0-2907-4fe3-9d91-eb6985cf4fe0 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.782743] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e73a5d0e-f638-46db-87ee-1b704ae88b5b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.798303] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f30871-6b06-4207-b9b2-14dba89be7bd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.804944] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a38ff11-0e73-45b2-923c-93f5879f6f3e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.836984] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 901.837160] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.837361] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 617f46a1-ca50-4561-9d0f-a596e35bf26d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 329835df-cb38-495e-8a0e-539a396ddc74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 323acb55-859a-4545-a046-1934cf98be6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.928022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 901.942651] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 901.955210] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8370a744-2602-410e-a509-e8487810e266 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 901.966145] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 901.975923] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 0c909272-30a0-40b7-ad1d-90933925ff6f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 901.986932] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3005e937-65d2-4e41-8dd7-2fecaaa15365 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.001745] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.012164] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7c21c92e-16ed-4e2c-90d5-9391b1eeb703 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.024422] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 749fc36c-c3de-4762-bae7-515dec3c7377 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.037161] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 04e18d39-9cf6-4c0e-ae33-29e955827571 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.050692] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 92c1d7af-79e0-4cd9-a7e5-a969b4843778 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.066305] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4726af42-5678-4b56-8675-76e30156feaa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.081814] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb007e0d-124f-4ef6-85d7-c68b310e8b9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.091339] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 58afa2a4-da8c-4b32-9c76-587d082de444 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.105257] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 722f8bf7-1634-4190-9cc0-49b2a28c367e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.120433] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.134496] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.148555] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.160602] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.176484] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.191873] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.206225] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8b11bda8-2923-4641-869b-39e4fce369b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.217904] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.227973] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 120c1330-9cdf-4db2-8c9f-1fa08dcad359 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 902.228239] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 902.228414] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 902.284315] env[68964]: DEBUG nova.network.neutron [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 902.294539] env[68964]: INFO nova.compute.manager [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Took 0.62 seconds to deallocate network for instance. 
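
Editor's note: the "Final resource view" above (used_ram=1664MB, used_vcpus=9, used_disk=9GB) follows directly from the nine "actively managed" instances listed by the tracker, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, plus the 512MB MEMORY_MB reservation from the inventory data. A quick check of that arithmetic, assuming (as the numbers suggest) that used_ram includes the host reservation while instances that have "yet to start" are not counted:

    active_instances = 9                    # the nine "actively managed" entries above
    per_instance = {'MEMORY_MB': 128, 'VCPU': 1, 'DISK_GB': 1}
    reserved_ram_mb = 512                   # MEMORY_MB 'reserved' in the inventory data

    used_ram_mb = reserved_ram_mb + active_instances * per_instance['MEMORY_MB']
    used_vcpus = active_instances * per_instance['VCPU']
    used_disk_gb = active_instances * per_instance['DISK_GB']

    # Matches the logged view: used_ram=1664MB, used_vcpus=9, used_disk=9GB
    assert (used_ram_mb, used_vcpus, used_disk_gb) == (1664, 9, 9)
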
[ 902.389955] env[68964]: INFO nova.scheduler.client.report [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Deleted allocations for instance 617f46a1-ca50-4561-9d0f-a596e35bf26d [ 902.422015] env[68964]: DEBUG oslo_concurrency.lockutils [None req-fc4287a9-82ba-46aa-9eb4-6d84d960eb33 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 248.124s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.423107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 49.938s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 902.423372] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Acquiring lock "617f46a1-ca50-4561-9d0f-a596e35bf26d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 902.423610] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 902.424125] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.426302] env[68964]: INFO nova.compute.manager [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Terminating instance [ 902.428097] env[68964]: DEBUG nova.compute.manager [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 902.428365] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 902.428850] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-09402148-9aee-4489-9854-fca4ba832343 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.435034] env[68964]: DEBUG nova.compute.manager [None req-d85f4462-2b04-4df3-b5a8-2e69ddc49452 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: fa867511-44ae-47e6-8c05-5f2abf8eae88] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 902.444159] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e6926e-0a84-4d52-a01f-7ca668a2f43c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.466868] env[68964]: DEBUG nova.compute.manager [None req-d85f4462-2b04-4df3-b5a8-2e69ddc49452 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: fa867511-44ae-47e6-8c05-5f2abf8eae88] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 902.479526] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 617f46a1-ca50-4561-9d0f-a596e35bf26d could not be found. [ 902.479787] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 902.480018] env[68964]: INFO nova.compute.manager [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Took 0.05 seconds to destroy the instance on the hypervisor. [ 902.480455] env[68964]: DEBUG oslo.service.loopingcall [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 902.482904] env[68964]: DEBUG nova.compute.manager [-] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 902.483027] env[68964]: DEBUG nova.network.neutron [-] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 902.495624] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d85f4462-2b04-4df3-b5a8-2e69ddc49452 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "fa867511-44ae-47e6-8c05-5f2abf8eae88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.824s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.511467] env[68964]: DEBUG nova.compute.manager [None req-77fc965e-a8ce-4e2a-be06-01367174d032 tempest-InstanceActionsV221TestJSON-849543650 tempest-InstanceActionsV221TestJSON-849543650-project-member] [instance: 888fb47f-5f48-415c-9289-61b9c42523e5] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 902.517021] env[68964]: DEBUG nova.network.neutron [-] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 902.523293] env[68964]: INFO nova.compute.manager [-] [instance: 617f46a1-ca50-4561-9d0f-a596e35bf26d] Took 0.04 seconds to deallocate network for instance. [ 902.556102] env[68964]: DEBUG nova.compute.manager [None req-77fc965e-a8ce-4e2a-be06-01367174d032 tempest-InstanceActionsV221TestJSON-849543650 tempest-InstanceActionsV221TestJSON-849543650-project-member] [instance: 888fb47f-5f48-415c-9289-61b9c42523e5] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 902.585125] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77fc965e-a8ce-4e2a-be06-01367174d032 tempest-InstanceActionsV221TestJSON-849543650 tempest-InstanceActionsV221TestJSON-849543650-project-member] Lock "888fb47f-5f48-415c-9289-61b9c42523e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.870s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.596871] env[68964]: DEBUG nova.compute.manager [None req-77e3bc7b-6647-4893-85cf-45f775e04cfc tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: c3ad57a5-1ea2-484a-b014-6276e0ee7914] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 902.629666] env[68964]: DEBUG nova.compute.manager [None req-77e3bc7b-6647-4893-85cf-45f775e04cfc tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: c3ad57a5-1ea2-484a-b014-6276e0ee7914] Instance disappeared before build. 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 902.656203] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77e3bc7b-6647-4893-85cf-45f775e04cfc tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "c3ad57a5-1ea2-484a-b014-6276e0ee7914" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.143s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.680772] env[68964]: DEBUG nova.compute.manager [None req-feef32b4-f1f8-4a9f-8191-d9b30f2f25a6 tempest-ServerExternalEventsTest-282581830 tempest-ServerExternalEventsTest-282581830-project-member] [instance: 7dbca935-17b3-4a4b-ae3e-558bc802f9b1] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 902.683118] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1454141e-f7d7-46ab-8519-1eba659440b6 tempest-ServersTestJSON-1233566270 tempest-ServersTestJSON-1233566270-project-member] Lock "617f46a1-ca50-4561-9d0f-a596e35bf26d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.260s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.706914] env[68964]: DEBUG nova.compute.manager [None req-feef32b4-f1f8-4a9f-8191-d9b30f2f25a6 tempest-ServerExternalEventsTest-282581830 tempest-ServerExternalEventsTest-282581830-project-member] [instance: 7dbca935-17b3-4a4b-ae3e-558bc802f9b1] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 902.725583] env[68964]: DEBUG oslo_concurrency.lockutils [None req-feef32b4-f1f8-4a9f-8191-d9b30f2f25a6 tempest-ServerExternalEventsTest-282581830 tempest-ServerExternalEventsTest-282581830-project-member] Lock "7dbca935-17b3-4a4b-ae3e-558bc802f9b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.895s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.743020] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 902.789115] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 902.823888] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7d3e6b8-a768-4063-aabd-a62c2fe612fc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.830394] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951926ed-ad3c-4d5b-b9ed-1a962dc098a7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.863997] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12d53886-201c-4e6a-a169-8698b15832f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.871730] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5288c1b2-dc5d-4347-a170-26e0a588c6bf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 902.885394] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 902.894063] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 902.908030] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 902.908268] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.071s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 902.908567] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.120s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 902.910025] env[68964]: INFO nova.compute.claims [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 903.354717] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0869a7f8-4cfb-476b-8525-ce1130f6fc94 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.362617] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b08d4bb8-28d9-429c-af46-18e25ef459d9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.395339] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c08434ea-43e4-4dba-8a31-41d1068cca85 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.402973] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6263163-16c6-4f37-867d-928c6b1c92ca {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.421090] env[68964]: DEBUG nova.compute.provider_tree [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 903.432766] env[68964]: DEBUG nova.scheduler.client.report [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 903.449312] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.541s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 903.449994] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 903.487805] env[68964]: DEBUG nova.compute.utils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 903.489092] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Not allocating networking since 'none' was specified. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 903.498347] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 903.560653] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 903.590364] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 903.590607] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 903.590770] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 903.591451] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 903.591451] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e 
tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 903.591552] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 903.591759] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 903.591857] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 903.592022] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 903.592270] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 903.592446] env[68964]: DEBUG nova.virt.hardware [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 903.593656] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06e3eb5e-adee-47cb-b09f-4037c4cff6f9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.606336] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96b336aa-5113-4fa7-a80c-dc713dba04b1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.628645] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance VIF info [] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 903.637588] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Creating folder: Project (08c306c477f945968539f658b3775fbd). Parent ref: group-v684465. 
{{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 903.638380] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-769f8b42-b9bb-4216-b50d-b1f0dcfb2052 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.648220] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Created folder: Project (08c306c477f945968539f658b3775fbd) in parent group-v684465. [ 903.648507] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Creating folder: Instances. Parent ref: group-v684519. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 903.648696] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-caee329d-1a79-43e1-90d7-6c150aeefd9e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.657520] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Created folder: Instances in parent group-v684519. [ 903.657585] env[68964]: DEBUG oslo.service.loopingcall [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 903.657737] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 903.657940] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4048be65-34f6-43c3-a960-d0538fbb84f1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 903.676373] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 903.676373] env[68964]: value = "task-3431578" [ 903.676373] env[68964]: _type = "Task" [ 903.676373] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 903.684392] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431578, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 903.875735] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 903.876008] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 903.876194] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 903.876366] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 903.876522] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 904.195538] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431578, 'name': CreateVM_Task, 'duration_secs': 0.252512} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 904.195538] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 904.195538] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 904.195538] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 904.195538] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 904.195538] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-19fb43dd-869e-453b-abd3-6fbb13a379a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 904.200019] env[68964]: DEBUG oslo_vmware.api [None 
req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for the task: (returnval){ [ 904.200019] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525120d4-0d6c-69a2-cc43-b9b220c85170" [ 904.200019] env[68964]: _type = "Task" [ 904.200019] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 904.206485] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525120d4-0d6c-69a2-cc43-b9b220c85170, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 904.709252] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 904.709615] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 904.709747] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 904.725406] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 904.725504] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 904.725560] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 904.757976] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.757976] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758142] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758265] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758457] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758608] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758735] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758857] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.758977] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.759112] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 904.759236] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 908.026505] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "4d272615-e2dd-4540-88d0-4a209f559147" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 908.026784] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "4d272615-e2dd-4540-88d0-4a209f559147" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 910.309102] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 912.963639] env[68964]: WARNING oslo_vmware.rw_handles [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 912.963639] env[68964]: ERROR oslo_vmware.rw_handles [ 912.964164] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 912.965774] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None
req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 912.966046] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Copying Virtual Disk [datastore1] vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/687c0606-0ff8-4ff0-9f4d-e408497df6de/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 912.966347] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-45660bbc-2961-4a62-b8ab-94e20d07f1cb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 912.975242] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Waiting for the task: (returnval){ [ 912.975242] env[68964]: value = "task-3431579" [ 912.975242] env[68964]: _type = "Task" [ 912.975242] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 912.983689] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Task: {'id': task-3431579, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 913.485665] env[68964]: DEBUG oslo_vmware.exceptions [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 913.486033] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 913.486599] env[68964]: ERROR nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 913.486599] env[68964]: Faults: ['InvalidArgument'] [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Traceback (most recent call last): [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] yield resources [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self.driver.spawn(context, instance, image_meta, [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self._vmops.spawn(context, instance, image_meta, injected_files, [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self._fetch_image_if_missing(context, vi) [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] image_cache(vi, tmp_image_ds_loc) [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] vm_util.copy_virtual_disk( [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] session._wait_for_task(vmdk_copy_task) [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] return self.wait_for_task(task_ref) [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] return evt.wait() [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] result = hub.switch() [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] return self.greenlet.switch() [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self.f(*self.args, **self.kw) [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] raise exceptions.translate_fault(task_info.error) [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Faults: ['InvalidArgument'] [ 913.486599] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] [ 913.487579] env[68964]: INFO nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Terminating instance [ 913.488470] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 913.488728] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 913.489430] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: 
cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 913.489593] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 913.489816] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f9285da9-38b0-43cf-80e0-83d389925729 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.492463] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ade5f0b1-aa02-4803-b3ac-61554afdb60f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.499260] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 913.499482] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-929d189a-0385-496d-beec-96f48b19e6db {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.501681] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 913.501853] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 913.502830] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-589af6e6-249f-499c-bd6a-f1f457360572 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.507667] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Waiting for the task: (returnval){ [ 913.507667] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522fc8cc-f85a-049e-2b5e-24f465c1f569" [ 913.507667] env[68964]: _type = "Task" [ 913.507667] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 913.514751] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522fc8cc-f85a-049e-2b5e-24f465c1f569, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 913.573484] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 913.573653] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 913.573814] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Deleting the datastore file [datastore1] cd99acb8-f78c-4c03-8c2e-2e9d50d18969 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 913.574215] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8f842c4e-ad6e-4bc3-87c7-77438017d48a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 913.580348] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Waiting for the task: (returnval){ [ 913.580348] env[68964]: value = "task-3431581" [ 913.580348] env[68964]: _type = "Task" [ 913.580348] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 913.588581] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Task: {'id': task-3431581, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 914.018668] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 914.018896] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Creating directory with path [datastore1] vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 914.019136] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aef6a9bd-7f75-4fd0-84ee-c0959870de99 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.030839] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Created directory with path [datastore1] vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 914.031051] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Fetch image to [datastore1] vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 914.031227] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 914.031968] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f125183-ebd2-4854-a524-83edfa8ac329 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.038751] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c2214c-77ad-4950-9c04-1a213188144e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.048071] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0cd4d3-885b-4f67-b3ec-8bc0d160f1c1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.079665] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-717e9c23-beb6-4eb5-817f-56dc69e5ac26 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.091068] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8d120078-e623-43ca-94dd-f51e2a926bc5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.093204] env[68964]: DEBUG oslo_vmware.api [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Task: {'id': task-3431581, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071177} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 914.093304] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 914.093436] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 914.093609] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 914.093779] env[68964]: INFO nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 914.095843] env[68964]: DEBUG nova.compute.claims [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 914.096065] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 914.096288] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 914.114379] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 914.163980] env[68964]: DEBUG nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Refreshing inventories for resource provider 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 914.166934] env[68964]: DEBUG oslo_vmware.rw_handles [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 914.227012] env[68964]: DEBUG nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Updating ProviderTree inventory for provider 63b0294e-f555-48a6-a542-3466427066a9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 914.227258] env[68964]: DEBUG nova.compute.provider_tree [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 914.231558] env[68964]: DEBUG oslo_vmware.rw_handles [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 914.231662] env[68964]: DEBUG oslo_vmware.rw_handles [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 914.240623] env[68964]: DEBUG nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Refreshing aggregate associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, aggregates: None {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 914.261345] env[68964]: DEBUG nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Refreshing trait associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 914.641217] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4efb5b89-3886-4ad1-96b7-e06dace15bef {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.649199] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4860f66b-4e2b-44e4-8012-173e0bb7b81c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.680317] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c883993a-e242-4422-b075-37f5f3b8b9ef {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.687525] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2f8a652-2dfd-4339-b28f-27e7c40f17b2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.700582] env[68964]: DEBUG nova.compute.provider_tree [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 914.708938] env[68964]: DEBUG nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 914.725331] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.629s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 914.726883] env[68964]: ERROR nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 914.726883] env[68964]: Faults: ['InvalidArgument'] [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Traceback (most recent call last): [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self.driver.spawn(context, instance, image_meta, [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self._vmops.spawn(context, instance, image_meta, injected_files, [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self._fetch_image_if_missing(context, vi) [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] image_cache(vi, tmp_image_ds_loc) [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] vm_util.copy_virtual_disk( [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] session._wait_for_task(vmdk_copy_task) [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] return self.wait_for_task(task_ref) [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] return evt.wait() [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] result = hub.switch() [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] return self.greenlet.switch() [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] self.f(*self.args, **self.kw) [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] raise exceptions.translate_fault(task_info.error) [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Faults: ['InvalidArgument'] [ 914.726883] env[68964]: ERROR nova.compute.manager [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] [ 914.726883] env[68964]: DEBUG nova.compute.utils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 914.729399] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Build of instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 was re-scheduled: A specified parameter was not correct: fileType [ 914.729399] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 914.729399] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 914.729399] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 914.729399] env[68964]: DEBUG nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 914.729399] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 915.199058] env[68964]: DEBUG nova.network.neutron [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 915.211307] env[68964]: INFO nova.compute.manager [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Took 0.48 seconds to deallocate network for instance. [ 915.329554] env[68964]: INFO nova.scheduler.client.report [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Deleted allocations for instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 [ 915.350430] env[68964]: DEBUG oslo_concurrency.lockutils [None req-4ab26416-718f-42b4-ae61-74cbc97f417b tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 230.795s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.351581] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 31.490s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.351805] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Acquiring lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.352014] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.352197] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.356376] env[68964]: INFO nova.compute.manager [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Terminating instance [ 915.357286] env[68964]: DEBUG nova.compute.manager [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 915.357999] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 915.358750] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ccdc1f18-754e-4fec-89ca-5b5e8b42fca8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.372144] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b108989-26bf-4c49-91b6-0dfa44a80f2c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.382693] env[68964]: DEBUG nova.compute.manager [None req-a957c0b7-5996-4d87-a63f-305c20176b3a tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 8370a744-2602-410e-a509-e8487810e266] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 915.407906] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cd99acb8-f78c-4c03-8c2e-2e9d50d18969 could not be found. 
[ 915.407906] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 915.407906] env[68964]: INFO nova.compute.manager [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Took 0.05 seconds to destroy the instance on the hypervisor. [ 915.407906] env[68964]: DEBUG oslo.service.loopingcall [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 915.408173] env[68964]: DEBUG nova.compute.manager [-] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 915.408280] env[68964]: DEBUG nova.network.neutron [-] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 915.413687] env[68964]: DEBUG nova.compute.manager [None req-a957c0b7-5996-4d87-a63f-305c20176b3a tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 8370a744-2602-410e-a509-e8487810e266] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 915.442905] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a957c0b7-5996-4d87-a63f-305c20176b3a tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "8370a744-2602-410e-a509-e8487810e266" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.079s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.446933] env[68964]: DEBUG nova.network.neutron [-] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 915.453534] env[68964]: DEBUG nova.compute.manager [None req-9eb9ae95-cb11-4471-9cea-3940f4fa39dc tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 915.457197] env[68964]: INFO nova.compute.manager [-] [instance: cd99acb8-f78c-4c03-8c2e-2e9d50d18969] Took 0.05 seconds to deallocate network for instance. [ 915.482860] env[68964]: DEBUG nova.compute.manager [None req-9eb9ae95-cb11-4471-9cea-3940f4fa39dc tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9] Instance disappeared before build. 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 915.503094] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9eb9ae95-cb11-4471-9cea-3940f4fa39dc tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "d55ab849-9aa1-4d65-9a30-7c5eeb77d9a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.876s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.518488] env[68964]: DEBUG nova.compute.manager [None req-8525e0d5-9df0-4466-b444-d8e0a48204f6 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: 0c909272-30a0-40b7-ad1d-90933925ff6f] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 915.547355] env[68964]: DEBUG nova.compute.manager [None req-8525e0d5-9df0-4466-b444-d8e0a48204f6 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: 0c909272-30a0-40b7-ad1d-90933925ff6f] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 915.567432] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8525e0d5-9df0-4466-b444-d8e0a48204f6 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "0c909272-30a0-40b7-ad1d-90933925ff6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.786s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.622311] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3aa92de1-edd5-425b-aa5e-f1338d2904c4 tempest-ServerAddressesTestJSON-1793302926 tempest-ServerAddressesTestJSON-1793302926-project-member] Lock "cd99acb8-f78c-4c03-8c2e-2e9d50d18969" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.271s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.627249] env[68964]: DEBUG nova.compute.manager [None req-9f788767-3e9d-452b-8dc1-95d934f9f408 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 3005e937-65d2-4e41-8dd7-2fecaaa15365] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 915.653810] env[68964]: DEBUG nova.compute.manager [None req-9f788767-3e9d-452b-8dc1-95d934f9f408 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 3005e937-65d2-4e41-8dd7-2fecaaa15365] Instance disappeared before build. 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 915.681833] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9f788767-3e9d-452b-8dc1-95d934f9f408 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "3005e937-65d2-4e41-8dd7-2fecaaa15365" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.638s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.698663] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 915.765408] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.765408] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.766689] env[68964]: INFO nova.compute.claims [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 916.166233] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3c43b5-9d06-49c5-950a-4f2795496083 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.175474] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab5ce76-4197-45c7-9e2a-1cc9197913e7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.209708] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4bd60b-4661-42f8-8992-ddb8dd4c9af5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.218715] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-324461cc-6a7f-4847-b147-8ecdf735688b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.233619] env[68964]: DEBUG nova.compute.provider_tree [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 916.241937] env[68964]: DEBUG nova.scheduler.client.report [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 916.255386] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.491s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 916.255861] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 916.298575] env[68964]: DEBUG nova.compute.utils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 916.303021] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 916.303021] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 916.308953] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 916.380374] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Start spawning the instance on the hypervisor. 
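The "Inventory has not changed" record above is what the resource tracker reports to Placement for this provider. Usable capacity per resource class follows Placement's documented rule, capacity = (total - reserved) * allocation_ratio; a quick check of the figures above in plain Python (variable names illustrative):

    # Capacity implied by the inventory record above, per Placement's rule:
    # capacity = (total - reserved) * allocation_ratio
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(rc, capacity)  # VCPU 192, MEMORY_MB 196078, DISK_GB 400

The m1.nano claim being made here (1 VCPU, 128 MB, 1 GB root disk) fits comfortably, which is why the instance_claim succeeds immediately.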
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 916.384482] env[68964]: DEBUG nova.policy [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'adccc7e764024769a4c2c9f14859bffe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '707860c1f4654a08995dbc255377e08b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 916.409113] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 916.409360] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 916.409512] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 916.409693] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 916.409880] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 916.409985] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 916.410497] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 916.410631] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 916.410811] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 916.411011] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 916.411211] env[68964]: DEBUG nova.virt.hardware [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 916.412505] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad42eaaa-ccc1-43f4-8372-728cf83230b4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.421668] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac629ec-dbbb-4820-8fe5-1f0f0d5fdc29 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.735409] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 916.735409] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 916.765046] env[68964]: DEBUG oslo_concurrency.lockutils [None req-598656b9-502b-4cef-9794-da74285cda21 tempest-AttachInterfacesTestJSON-1409797562 
tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "66318915-69a7-4f3a-8aa2-377948732cc5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 916.765046] env[68964]: DEBUG oslo_concurrency.lockutils [None req-598656b9-502b-4cef-9794-da74285cda21 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "66318915-69a7-4f3a-8aa2-377948732cc5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 916.860585] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Successfully created port: 6b1a5a92-c0de-473f-96b2-040444aa40af {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 917.830279] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 917.841122] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Successfully updated port: 6b1a5a92-c0de-473f-96b2-040444aa40af {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 917.849025] env[68964]: DEBUG nova.compute.manager [req-805a96aa-65f0-457a-814f-51ca4fb2e64a req-e4abdca7-6bb1-4b72-b877-28f1529d2604 service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Received event network-vif-plugged-6b1a5a92-c0de-473f-96b2-040444aa40af {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 917.849249] env[68964]: DEBUG oslo_concurrency.lockutils [req-805a96aa-65f0-457a-814f-51ca4fb2e64a req-e4abdca7-6bb1-4b72-b877-28f1529d2604 service nova] Acquiring lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 917.849444] env[68964]: DEBUG oslo_concurrency.lockutils [req-805a96aa-65f0-457a-814f-51ca4fb2e64a req-e4abdca7-6bb1-4b72-b877-28f1529d2604 service nova] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 917.849606] env[68964]: DEBUG oslo_concurrency.lockutils [req-805a96aa-65f0-457a-814f-51ca4fb2e64a req-e4abdca7-6bb1-4b72-b877-28f1529d2604 service nova] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.849763] env[68964]: DEBUG nova.compute.manager [req-805a96aa-65f0-457a-814f-51ca4fb2e64a req-e4abdca7-6bb1-4b72-b877-28f1529d2604 service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] No waiting events found dispatching network-vif-plugged-6b1a5a92-c0de-473f-96b2-040444aa40af {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 917.849913] env[68964]: WARNING nova.compute.manager [req-805a96aa-65f0-457a-814f-51ca4fb2e64a req-e4abdca7-6bb1-4b72-b877-28f1529d2604 service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Received unexpected event network-vif-plugged-6b1a5a92-c0de-473f-96b2-040444aa40af for instance with vm_state building and task_state deleting. [ 917.860389] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 917.860389] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquired lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 917.860780] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 917.916588] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 918.345766] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Updating instance_info_cache with network_info: [{"id": "6b1a5a92-c0de-473f-96b2-040444aa40af", "address": "fa:16:3e:0c:98:eb", "network": {"id": "58d7f676-b330-4999-9898-0348cfefeb27", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1222318342-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "707860c1f4654a08995dbc255377e08b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b1a5a92-c0", "ovs_interfaceid": "6b1a5a92-c0de-473f-96b2-040444aa40af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 918.357451] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Releasing lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 918.357806] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance network_info: |[{"id": "6b1a5a92-c0de-473f-96b2-040444aa40af", "address": "fa:16:3e:0c:98:eb", "network": {"id": "58d7f676-b330-4999-9898-0348cfefeb27", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1222318342-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "707860c1f4654a08995dbc255377e08b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b1a5a92-c0", "ovs_interfaceid": "6b1a5a92-c0de-473f-96b2-040444aa40af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
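The network_info blob cached above is Nova's serialized view of the instance's single VIF; the port id, MAC, and fixed IP used later for VIF plumbing can be read straight out of it. A minimal sketch treating the cache entry as plain JSON, trimmed to just the fields read here (the real payload carries many more keys):

    import json

    # Trimmed copy of the instance_info_cache entry above.
    cached = json.loads('''[{"id": "6b1a5a92-c0de-473f-96b2-040444aa40af",
      "address": "fa:16:3e:0c:98:eb",
      "network": {"subnets": [{"cidr": "192.168.128.0/28",
        "ips": [{"address": "192.168.128.14", "type": "fixed"}]}]}}]''')

    for vif in cached:
        fixed = [ip['address']
                 for subnet in vif['network']['subnets']
                 for ip in subnet['ips'] if ip['type'] == 'fixed']
        print(vif['id'], vif['address'], fixed)
        # 6b1a5a92-c0de-473f-96b2-040444aa40af fa:16:3e:0c:98:eb ['192.168.128.14']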
{{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 918.358158] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0c:98:eb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0f096917-a0cf-4add-a9d2-23ca1c723b3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b1a5a92-c0de-473f-96b2-040444aa40af', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 918.365438] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Creating folder: Project (707860c1f4654a08995dbc255377e08b). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 918.366141] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d61b3344-943a-40b5-97b2-5dfe0b34d8dc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.376011] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Created folder: Project (707860c1f4654a08995dbc255377e08b) in parent group-v684465. [ 918.376208] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Creating folder: Instances. Parent ref: group-v684522. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 918.376436] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-36fb731e-7e97-454f-b8fe-869d3721206c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.385343] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Created folder: Instances in parent group-v684522. [ 918.385448] env[68964]: DEBUG oslo.service.loopingcall [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 918.385639] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 918.385907] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f44bb886-bed7-40db-bf84-999bfd19956e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.404760] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 918.404760] env[68964]: value = "task-3431584" [ 918.404760] env[68964]: _type = "Task" [ 918.404760] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 918.412661] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431584, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 918.917144] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431584, 'name': CreateVM_Task, 'duration_secs': 0.319267} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 918.917416] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 918.918441] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 918.918441] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 918.918807] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 918.918856] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c9b2924f-8d86-4c91-aa8d-cab4a1cc15fd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 918.923927] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Waiting for the task: (returnval){ [ 918.923927] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]526a7b71-f045-d593-de18-51d8cd02c9e9" [ 918.923927] env[68964]: _type = "Task" [ 918.923927] env[68964]: } to 
complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 918.934679] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]526a7b71-f045-d593-de18-51d8cd02c9e9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 919.440542] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 919.440853] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 919.441094] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 919.849431] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ef551ca1-d475-435a-b88a-a49a772cb711 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] Acquiring lock "5c076ffe-9532-4d57-b044-a74a48cb147d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 919.850283] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ef551ca1-d475-435a-b88a-a49a772cb711 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] Lock "5c076ffe-9532-4d57-b044-a74a48cb147d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 920.408260] env[68964]: DEBUG nova.compute.manager [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Received event network-changed-6b1a5a92-c0de-473f-96b2-040444aa40af {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 920.408686] env[68964]: DEBUG nova.compute.manager [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Refreshing instance network info cache due to event network-changed-6b1a5a92-c0de-473f-96b2-040444aa40af. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 920.408902] env[68964]: DEBUG oslo_concurrency.lockutils [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] Acquiring lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 920.409118] env[68964]: DEBUG oslo_concurrency.lockutils [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] Acquired lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 920.409329] env[68964]: DEBUG nova.network.neutron [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Refreshing network info cache for port 6b1a5a92-c0de-473f-96b2-040444aa40af {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 921.098252] env[68964]: DEBUG nova.network.neutron [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Updated VIF entry in instance network info cache for port 6b1a5a92-c0de-473f-96b2-040444aa40af. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 921.100761] env[68964]: DEBUG nova.network.neutron [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Updating instance_info_cache with network_info: [{"id": "6b1a5a92-c0de-473f-96b2-040444aa40af", "address": "fa:16:3e:0c:98:eb", "network": {"id": "58d7f676-b330-4999-9898-0348cfefeb27", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1222318342-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "707860c1f4654a08995dbc255377e08b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f096917-a0cf-4add-a9d2-23ca1c723b3b", "external-id": "nsx-vlan-transportzone-894", "segmentation_id": 894, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b1a5a92-c0", "ovs_interfaceid": "6b1a5a92-c0de-473f-96b2-040444aa40af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 921.119507] env[68964]: DEBUG oslo_concurrency.lockutils [req-661b79d9-72d4-4904-b951-d71e10b7aa64 req-9231a1d4-c1e3-4c6f-8484-e48d1b19483c service nova] Releasing lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 923.320431] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a43e778a-5369-429c-85e3-1e2ecb4d0013 tempest-ServerAddressesNegativeTestJSON-1242941731 
tempest-ServerAddressesNegativeTestJSON-1242941731-project-member] Acquiring lock "41317213-a0f2-42fc-9e44-dfe83d27a811" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 923.320749] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a43e778a-5369-429c-85e3-1e2ecb4d0013 tempest-ServerAddressesNegativeTestJSON-1242941731 tempest-ServerAddressesNegativeTestJSON-1242941731-project-member] Lock "41317213-a0f2-42fc-9e44-dfe83d27a811" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 928.143204] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8240fc55-8211-4a42-aa95-a0c9eef58693 tempest-ServerShowV257Test-1966615260 tempest-ServerShowV257Test-1966615260-project-member] Acquiring lock "864ec33b-2840-4ed3-b0b6-2ef062141705" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 928.143580] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8240fc55-8211-4a42-aa95-a0c9eef58693 tempest-ServerShowV257Test-1966615260 tempest-ServerShowV257Test-1966615260-project-member] Lock "864ec33b-2840-4ed3-b0b6-2ef062141705" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.721633] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.722288] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.756219] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "c7853bb3-fa53-4911-818f-e03245ad3a0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.756219] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "c7853bb3-fa53-4911-818f-e03245ad3a0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 938.921764] env[68964]: DEBUG oslo_concurrency.lockutils [None req-998facda-bbd0-4953-b652-1aea56ea8704 tempest-ServerActionsTestOtherB-1759788798 tempest-ServerActionsTestOtherB-1759788798-project-member] Acquiring lock "19f90c65-2865-4fa7-b647-f69fd217e1e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 938.922091] env[68964]: DEBUG oslo_concurrency.lockutils [None req-998facda-bbd0-4953-b652-1aea56ea8704 tempest-ServerActionsTestOtherB-1759788798 tempest-ServerActionsTestOtherB-1759788798-project-member] Lock "19f90c65-2865-4fa7-b647-f69fd217e1e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 949.962719] env[68964]: WARNING oslo_vmware.rw_handles [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 949.962719] env[68964]: ERROR oslo_vmware.rw_handles
[ 949.962719] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 949.964602] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 949.964867] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4
tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Copying Virtual Disk [datastore2] vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/c23d6124-cc57-4d95-975a-b190ce222580/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 949.965179] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1d5cba2c-d95f-4767-82b6-d61b8eddbb84 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.973188] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Waiting for the task: (returnval){ [ 949.973188] env[68964]: value = "task-3431585" [ 949.973188] env[68964]: _type = "Task" [ 949.973188] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 949.980894] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Task: {'id': task-3431585, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 950.482540] env[68964]: DEBUG oslo_vmware.exceptions [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 950.482852] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 950.483411] env[68964]: ERROR nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 950.483411] env[68964]: Faults: ['InvalidArgument']
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Traceback (most recent call last):
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] yield resources
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self.driver.spawn(context, instance, image_meta,
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self._fetch_image_if_missing(context, vi)
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] image_cache(vi, tmp_image_ds_loc)
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] vm_util.copy_virtual_disk(
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] session._wait_for_task(vmdk_copy_task)
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] return self.wait_for_task(task_ref)
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] return evt.wait()
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] result = hub.switch()
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] return self.greenlet.switch()
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self.f(*self.args, **self.kw)
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] raise exceptions.translate_fault(task_info.error)
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Faults: ['InvalidArgument']
[ 950.483411] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf]
[ 950.484423] env[68964]: INFO nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Terminating instance
[ 950.485274] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 950.485479] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 950.485708] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1ee51fc-b239-447d-b72a-505d7e077765 {{(pid=68964) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.488136] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 950.488337] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 950.489138] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84a06cd9-4432-4fa2-a042-d28c4678878f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.496076] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 950.496297] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ab2957ee-514c-46f8-bf85-718f7bac422c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.498503] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 950.498699] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 950.499653] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-12f60881-b7df-4f81-8cb3-b241b550613d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.504562] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for the task: (returnval){ [ 950.504562] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5298b6e9-1428-ad24-fd78-7533cab884ec" [ 950.504562] env[68964]: _type = "Task" [ 950.504562] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 950.511879] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5298b6e9-1428-ad24-fd78-7533cab884ec, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 950.567970] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 950.567970] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 950.568170] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Deleting the datastore file [datastore2] 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 950.568425] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b5d9b119-f11c-4543-883d-6990767800e4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.574514] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Waiting for the task: (returnval){ [ 950.574514] env[68964]: value = "task-3431587" [ 950.574514] env[68964]: _type = "Task" [ 950.574514] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 950.581737] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Task: {'id': task-3431587, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 951.016105] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 951.016408] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Creating directory with path [datastore2] vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 951.016465] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-881d6b3d-2f77-4366-94ab-d78979a14ded {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.029105] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Created directory with path [datastore2] vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 951.029105] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Fetch image to [datastore2] vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 951.029105] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 951.029334] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-946e64bf-4c0a-42be-adde-4b5e6ec66518 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.035934] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dd3965c-f6ec-49a4-a010-d1d76df1f9d0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.044926] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f24a9e44-5e4c-40ae-acaa-fb39e9c12cf0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.078312] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9711826a-dfe6-40b3-a17f-d825c3921958 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.085190] env[68964]: DEBUG oslo_vmware.api [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Task: {'id': task-3431587, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078878} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 951.086725] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 951.086966] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 951.087094] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 951.087269] env[68964]: INFO nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Took 0.60 seconds to destroy the instance on the hypervisor. 
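The failure above is the defining fault of this log: vCenter rejects nova's CopyVirtualDisk_Task with InvalidArgument on the fileType parameter while the sparse image is being cached, so the spawn aborts and the instance is torn down (the claim-abort and reschedule entries follow). The traceback pins the call path to nova.virt.vmwareapi.vm_util.copy_virtual_disk waiting on the task through oslo.vmware. A minimal sketch of that path, assuming only the call sites named in the traceback; the helper below is illustrative, not the exact nova source:

    # Sketch of the copy path from the traceback (vm_util.py:1423 -> api.py:448).
    # 'session' is an oslo_vmware.api.VMwareAPISession; invoke_api() and
    # wait_for_task() are real oslo.vmware methods, everything else mirrors
    # the logged call sites.
    def copy_virtual_disk(session, dc_ref, source_path, dest_path):
        disk_mgr = session.vim.service_content.virtualDiskManager
        # The VirtualDiskManager.CopyVirtualDisk_Task invocation logged at
        # 949.965179 above.
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', disk_mgr,
                                  sourceName=source_path,
                                  sourceDatacenter=dc_ref,
                                  destName=dest_path)
        # wait_for_task() polls the task info (the 'progress is 0%' lines) and
        # raises VimFaultException when the task errors -- here with
        # 'A specified parameter was not correct: fileType'.
        session.wait_for_task(task)

In the log the same fault therefore surfaces twice per instance: once when the spawn fails inside _build_resources, and again when _build_and_run_instance records the rescheduling decision.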
[ 951.089293] env[68964]: DEBUG nova.compute.claims [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 951.089464] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.089670] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.092080] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7ae85c80-ab6e-4098-993f-c2562105869d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.113839] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 951.223321] env[68964]: DEBUG oslo_vmware.rw_handles [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 951.287286] env[68964]: DEBUG oslo_vmware.rw_handles [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 951.287479] env[68964]: DEBUG oslo_vmware.rw_handles [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 951.523097] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71c18908-39ea-4c7d-8e30-8f7b63394356 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.530648] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a14f8ba-456a-4f8b-8abf-eddacfcdd31d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.561606] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48bbb83f-9869-492e-a4f7-02de6047fd6b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.568941] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fd78052-8cc4-42de-b0c8-0ca241e49194 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.581953] env[68964]: DEBUG nova.compute.provider_tree [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 951.593563] env[68964]: DEBUG nova.scheduler.client.report [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 951.607542] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.518s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.608081] env[68964]: ERROR nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 951.608081] env[68964]: Faults: ['InvalidArgument'] [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Traceback (most recent call last): [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 951.608081] env[68964]: ERROR 
nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self.driver.spawn(context, instance, image_meta, [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self._fetch_image_if_missing(context, vi) [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] image_cache(vi, tmp_image_ds_loc) [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] vm_util.copy_virtual_disk( [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] session._wait_for_task(vmdk_copy_task) [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] return self.wait_for_task(task_ref) [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] return evt.wait() [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] result = hub.switch() [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] return self.greenlet.switch() [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] self.f(*self.args, **self.kw) [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] raise exceptions.translate_fault(task_info.error) [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Faults: ['InvalidArgument'] [ 951.608081] env[68964]: ERROR nova.compute.manager [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] [ 951.608991] env[68964]: DEBUG nova.compute.utils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 951.610143] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Build of instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf was re-scheduled: A specified parameter was not correct: fileType [ 951.610143] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 951.610527] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 951.610706] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 951.610859] env[68964]: DEBUG nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 951.611045] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 952.160311] env[68964]: DEBUG nova.network.neutron [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 952.173132] env[68964]: INFO nova.compute.manager [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Took 0.56 seconds to deallocate network for instance. [ 952.274306] env[68964]: INFO nova.scheduler.client.report [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Deleted allocations for instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf [ 952.294545] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7c61bb9-4bfc-4a84-838a-90355a26c8d4 tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 291.093s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.295697] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 93.753s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 952.295931] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Acquiring lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 952.296149] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 952.296313] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.302551] env[68964]: INFO nova.compute.manager [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Terminating instance [ 952.308447] env[68964]: DEBUG nova.compute.manager [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 952.311601] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 952.311601] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b12729e3-dee4-40ae-bdd1-28c64883a308 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.314055] env[68964]: DEBUG nova.compute.manager [None req-ada3d0bd-f797-4749-96da-db4046c9ae04 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] [instance: 7c21c92e-16ed-4e2c-90d5-9391b1eeb703] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.325018] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc138853-f63d-451d-a847-078745b9e161 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 952.341315] env[68964]: DEBUG nova.compute.manager [None req-ada3d0bd-f797-4749-96da-db4046c9ae04 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] [instance: 7c21c92e-16ed-4e2c-90d5-9391b1eeb703] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.357047] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf could not be found. 
[ 952.357359] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 952.357538] env[68964]: INFO nova.compute.manager [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Took 0.05 seconds to destroy the instance on the hypervisor. [ 952.357818] env[68964]: DEBUG oslo.service.loopingcall [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 952.358104] env[68964]: DEBUG nova.compute.manager [-] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 952.358241] env[68964]: DEBUG nova.network.neutron [-] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 952.378742] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ada3d0bd-f797-4749-96da-db4046c9ae04 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] Lock "7c21c92e-16ed-4e2c-90d5-9391b1eeb703" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.013s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.391031] env[68964]: DEBUG nova.compute.manager [None req-efda74b2-3b38-4e8f-bdf0-bd6d7b1206ce tempest-AttachInterfacesUnderV243Test-1445261461 tempest-AttachInterfacesUnderV243Test-1445261461-project-member] [instance: 749fc36c-c3de-4762-bae7-515dec3c7377] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.397553] env[68964]: DEBUG nova.network.neutron [-] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 952.405446] env[68964]: INFO nova.compute.manager [-] [instance: 0ea7c687-a9a6-4531-b9a9-4cf3e74940bf] Took 0.05 seconds to deallocate network for instance. [ 952.420858] env[68964]: DEBUG nova.compute.manager [None req-efda74b2-3b38-4e8f-bdf0-bd6d7b1206ce tempest-AttachInterfacesUnderV243Test-1445261461 tempest-AttachInterfacesUnderV243Test-1445261461-project-member] [instance: 749fc36c-c3de-4762-bae7-515dec3c7377] Instance disappeared before build. 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.441325] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efda74b2-3b38-4e8f-bdf0-bd6d7b1206ce tempest-AttachInterfacesUnderV243Test-1445261461 tempest-AttachInterfacesUnderV243Test-1445261461-project-member] Lock "749fc36c-c3de-4762-bae7-515dec3c7377" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.597s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.452692] env[68964]: DEBUG nova.compute.manager [None req-cdc17b90-87c8-4f1b-b229-fc98d44ab136 tempest-ServerActionsV293TestJSON-572515950 tempest-ServerActionsV293TestJSON-572515950-project-member] [instance: 04e18d39-9cf6-4c0e-ae33-29e955827571] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.482072] env[68964]: DEBUG nova.compute.manager [None req-cdc17b90-87c8-4f1b-b229-fc98d44ab136 tempest-ServerActionsV293TestJSON-572515950 tempest-ServerActionsV293TestJSON-572515950-project-member] [instance: 04e18d39-9cf6-4c0e-ae33-29e955827571] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.499182] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cb817920-2cf7-42fa-9b1d-589a0db0958c tempest-ServerDiagnosticsTest-2094767189 tempest-ServerDiagnosticsTest-2094767189-project-member] Lock "0ea7c687-a9a6-4531-b9a9-4cf3e74940bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.203s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.503737] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cdc17b90-87c8-4f1b-b229-fc98d44ab136 tempest-ServerActionsV293TestJSON-572515950 tempest-ServerActionsV293TestJSON-572515950-project-member] Lock "04e18d39-9cf6-4c0e-ae33-29e955827571" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.027s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.517503] env[68964]: DEBUG nova.compute.manager [None req-f45a791f-d01e-45aa-957f-95b8a08e9b19 tempest-ImagesNegativeTestJSON-484771123 tempest-ImagesNegativeTestJSON-484771123-project-member] [instance: 92c1d7af-79e0-4cd9-a7e5-a969b4843778] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.539745] env[68964]: DEBUG nova.compute.manager [None req-f45a791f-d01e-45aa-957f-95b8a08e9b19 tempest-ImagesNegativeTestJSON-484771123 tempest-ImagesNegativeTestJSON-484771123-project-member] [instance: 92c1d7af-79e0-4cd9-a7e5-a969b4843778] Instance disappeared before build. 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.559550] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f45a791f-d01e-45aa-957f-95b8a08e9b19 tempest-ImagesNegativeTestJSON-484771123 tempest-ImagesNegativeTestJSON-484771123-project-member] Lock "92c1d7af-79e0-4cd9-a7e5-a969b4843778" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.419s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.570623] env[68964]: DEBUG nova.compute.manager [None req-f40ee082-f03a-44f5-bad3-1c2692051779 tempest-ServerActionsTestOtherA-473204481 tempest-ServerActionsTestOtherA-473204481-project-member] [instance: 4726af42-5678-4b56-8675-76e30156feaa] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.598585] env[68964]: DEBUG nova.compute.manager [None req-f40ee082-f03a-44f5-bad3-1c2692051779 tempest-ServerActionsTestOtherA-473204481 tempest-ServerActionsTestOtherA-473204481-project-member] [instance: 4726af42-5678-4b56-8675-76e30156feaa] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.622934] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f40ee082-f03a-44f5-bad3-1c2692051779 tempest-ServerActionsTestOtherA-473204481 tempest-ServerActionsTestOtherA-473204481-project-member] Lock "4726af42-5678-4b56-8675-76e30156feaa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.599s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.632745] env[68964]: DEBUG nova.compute.manager [None req-3ea3d12d-8d1c-45c4-ae3c-3e452436cd1c tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] [instance: eb007e0d-124f-4ef6-85d7-c68b310e8b9f] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.663815] env[68964]: DEBUG nova.compute.manager [None req-3ea3d12d-8d1c-45c4-ae3c-3e452436cd1c tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] [instance: eb007e0d-124f-4ef6-85d7-c68b310e8b9f] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.696598] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3ea3d12d-8d1c-45c4-ae3c-3e452436cd1c tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] Lock "eb007e0d-124f-4ef6-85d7-c68b310e8b9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.315s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.707230] env[68964]: DEBUG nova.compute.manager [None req-54b22aed-4387-4158-86c3-736a2e0cf3ec tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] [instance: 58afa2a4-da8c-4b32-9c76-587d082de444] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.739423] env[68964]: DEBUG nova.compute.manager [None req-54b22aed-4387-4158-86c3-736a2e0cf3ec tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] [instance: 58afa2a4-da8c-4b32-9c76-587d082de444] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.761903] env[68964]: DEBUG oslo_concurrency.lockutils [None req-54b22aed-4387-4158-86c3-736a2e0cf3ec tempest-ListImageFiltersTestJSON-902581978 tempest-ListImageFiltersTestJSON-902581978-project-member] Lock "58afa2a4-da8c-4b32-9c76-587d082de444" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.960s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.772665] env[68964]: DEBUG nova.compute.manager [None req-df9132b3-9f33-4942-8b4e-b722eb1320d5 tempest-ImagesOneServerTestJSON-1548653576 tempest-ImagesOneServerTestJSON-1548653576-project-member] [instance: 722f8bf7-1634-4190-9cc0-49b2a28c367e] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.807844] env[68964]: DEBUG nova.compute.manager [None req-df9132b3-9f33-4942-8b4e-b722eb1320d5 tempest-ImagesOneServerTestJSON-1548653576 tempest-ImagesOneServerTestJSON-1548653576-project-member] [instance: 722f8bf7-1634-4190-9cc0-49b2a28c367e] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 952.831463] env[68964]: DEBUG oslo_concurrency.lockutils [None req-df9132b3-9f33-4942-8b4e-b722eb1320d5 tempest-ImagesOneServerTestJSON-1548653576 tempest-ImagesOneServerTestJSON-1548653576-project-member] Lock "722f8bf7-1634-4190-9cc0-49b2a28c367e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.443s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 952.843735] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 952.906379] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 952.906633] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 952.908159] env[68964]: INFO nova.compute.claims [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 953.366322] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54d0b2de-8ab5-4829-bfdf-77572aa42505 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 953.374110] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ec5d176-a6fc-4d5a-9605-c3f3411b8a4e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 953.406229] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a9cad1d-65d0-4f9f-84a4-c666bbd88c85 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 953.415489] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84316a05-3b88-4fcb-87dd-20c5b8ac7c16 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 953.431015] env[68964]: DEBUG nova.compute.provider_tree [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 953.467288] env[68964]: DEBUG nova.scheduler.client.report [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
953.490380] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.584s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 953.490920] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 953.529034] env[68964]: DEBUG nova.compute.utils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 953.530489] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 953.530657] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 953.549115] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 953.623127] env[68964]: DEBUG nova.policy [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56368994129d4801acdebda8c8b8181a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64fea1d1b3994eebaeba6a57009973ba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 953.643209] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 953.676843] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 953.677328] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 953.677511] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 953.677812] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 953.678052] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 953.678252] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 953.678540] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 953.678756] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 953.678955] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 953.679219] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 953.679517] env[68964]: DEBUG nova.virt.hardware [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 953.680990] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7983732c-be1e-4ca0-b787-a357b1436dce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 953.691149] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa2ce0bc-8d43-4f82-948a-bdd17bd7b323 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 954.051118] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Successfully created port: 15a0b4f7-1583-48ba-82d3-ca29e720ddab {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 955.233476] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cc24d389-ee17-4a7f-9b89-78a15f8dd133 tempest-TenantUsagesTestJSON-431596998 tempest-TenantUsagesTestJSON-431596998-project-member] Acquiring lock "c1efc344-848b-4a98-a20a-57ebdfb5ac8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 955.233789] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cc24d389-ee17-4a7f-9b89-78a15f8dd133 tempest-TenantUsagesTestJSON-431596998 tempest-TenantUsagesTestJSON-431596998-project-member] Lock "c1efc344-848b-4a98-a20a-57ebdfb5ac8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 955.307841] env[68964]: DEBUG nova.compute.manager [req-4bd5edf5-a0b1-42c2-854a-dab6edc64e71 req-f8e6b875-ad4f-4007-b6e2-abfb68fbadf0 service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Received event network-vif-plugged-15a0b4f7-1583-48ba-82d3-ca29e720ddab {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 955.308069] env[68964]: DEBUG oslo_concurrency.lockutils [req-4bd5edf5-a0b1-42c2-854a-dab6edc64e71 req-f8e6b875-ad4f-4007-b6e2-abfb68fbadf0 service nova] Acquiring 
lock "3770333e-4721-424d-ac86-2291c002e99a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 955.308278] env[68964]: DEBUG oslo_concurrency.lockutils [req-4bd5edf5-a0b1-42c2-854a-dab6edc64e71 req-f8e6b875-ad4f-4007-b6e2-abfb68fbadf0 service nova] Lock "3770333e-4721-424d-ac86-2291c002e99a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 955.309492] env[68964]: DEBUG oslo_concurrency.lockutils [req-4bd5edf5-a0b1-42c2-854a-dab6edc64e71 req-f8e6b875-ad4f-4007-b6e2-abfb68fbadf0 service nova] Lock "3770333e-4721-424d-ac86-2291c002e99a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 955.309492] env[68964]: DEBUG nova.compute.manager [req-4bd5edf5-a0b1-42c2-854a-dab6edc64e71 req-f8e6b875-ad4f-4007-b6e2-abfb68fbadf0 service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] No waiting events found dispatching network-vif-plugged-15a0b4f7-1583-48ba-82d3-ca29e720ddab {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 955.309492] env[68964]: WARNING nova.compute.manager [req-4bd5edf5-a0b1-42c2-854a-dab6edc64e71 req-f8e6b875-ad4f-4007-b6e2-abfb68fbadf0 service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Received unexpected event network-vif-plugged-15a0b4f7-1583-48ba-82d3-ca29e720ddab for instance with vm_state building and task_state spawning. [ 955.341826] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Successfully updated port: 15a0b4f7-1583-48ba-82d3-ca29e720ddab {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 955.355662] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "refresh_cache-3770333e-4721-424d-ac86-2291c002e99a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 955.356866] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquired lock "refresh_cache-3770333e-4721-424d-ac86-2291c002e99a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 955.356866] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 955.463218] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 
tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 955.760625] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Updating instance_info_cache with network_info: [{"id": "15a0b4f7-1583-48ba-82d3-ca29e720ddab", "address": "fa:16:3e:17:e6:83", "network": {"id": "b1523e6a-6a7b-4fc9-8420-df9737c1409a", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1760857168-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "64fea1d1b3994eebaeba6a57009973ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8abee039-d93e-48a7-8911-6416a3e1ff30", "external-id": "nsx-vlan-transportzone-654", "segmentation_id": 654, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15a0b4f7-15", "ovs_interfaceid": "15a0b4f7-1583-48ba-82d3-ca29e720ddab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 955.775312] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Releasing lock "refresh_cache-3770333e-4721-424d-ac86-2291c002e99a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 955.775632] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance network_info: |[{"id": "15a0b4f7-1583-48ba-82d3-ca29e720ddab", "address": "fa:16:3e:17:e6:83", "network": {"id": "b1523e6a-6a7b-4fc9-8420-df9737c1409a", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1760857168-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "64fea1d1b3994eebaeba6a57009973ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8abee039-d93e-48a7-8911-6416a3e1ff30", "external-id": "nsx-vlan-transportzone-654", "segmentation_id": 654, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15a0b4f7-15", "ovs_interfaceid": "15a0b4f7-1583-48ba-82d3-ca29e720ddab", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 955.776027] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:17:e6:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8abee039-d93e-48a7-8911-6416a3e1ff30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '15a0b4f7-1583-48ba-82d3-ca29e720ddab', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 955.783652] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Creating folder: Project (64fea1d1b3994eebaeba6a57009973ba). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 955.784233] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-74dad74d-a91d-4340-b0e7-0ec585a1d567 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.794089] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Created folder: Project (64fea1d1b3994eebaeba6a57009973ba) in parent group-v684465. [ 955.794287] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Creating folder: Instances. Parent ref: group-v684525. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 955.794517] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-90af7d36-bc51-43d0-8e8b-d03ad2ab2839 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.803067] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Created folder: Instances in parent group-v684525. [ 955.803507] env[68964]: DEBUG oslo.service.loopingcall [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 955.803612] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 955.804207] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ca722942-90d0-4891-bbcc-2587ef43ab8a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.823042] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 955.823042] env[68964]: value = "task-3431590" [ 955.823042] env[68964]: _type = "Task" [ 955.823042] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 955.836831] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431590, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 956.333638] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431590, 'name': CreateVM_Task, 'duration_secs': 0.354762} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 956.333880] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 956.335029] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 956.335029] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 956.335029] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 956.335218] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5fbbf168-c683-4f1e-b8b3-7f16c47e154a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 956.339753] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Waiting for the task: (returnval){ [ 956.339753] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525a78a0-545a-6cc6-f691-b27da7f41d31" [ 956.339753] env[68964]: _type = "Task" [ 956.339753] 
env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 956.351020] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525a78a0-545a-6cc6-f691-b27da7f41d31, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 956.852483] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 956.852771] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 956.853369] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 957.334169] env[68964]: DEBUG nova.compute.manager [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Received event network-changed-15a0b4f7-1583-48ba-82d3-ca29e720ddab {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 957.334404] env[68964]: DEBUG nova.compute.manager [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Refreshing instance network info cache due to event network-changed-15a0b4f7-1583-48ba-82d3-ca29e720ddab. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 957.334571] env[68964]: DEBUG oslo_concurrency.lockutils [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] Acquiring lock "refresh_cache-3770333e-4721-424d-ac86-2291c002e99a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 957.334714] env[68964]: DEBUG oslo_concurrency.lockutils [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] Acquired lock "refresh_cache-3770333e-4721-424d-ac86-2291c002e99a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 957.334875] env[68964]: DEBUG nova.network.neutron [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Refreshing network info cache for port 15a0b4f7-1583-48ba-82d3-ca29e720ddab {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 957.609963] env[68964]: DEBUG nova.network.neutron [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Updated VIF entry in instance network info cache for port 15a0b4f7-1583-48ba-82d3-ca29e720ddab. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 957.610331] env[68964]: DEBUG nova.network.neutron [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Updating instance_info_cache with network_info: [{"id": "15a0b4f7-1583-48ba-82d3-ca29e720ddab", "address": "fa:16:3e:17:e6:83", "network": {"id": "b1523e6a-6a7b-4fc9-8420-df9737c1409a", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1760857168-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "64fea1d1b3994eebaeba6a57009973ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8abee039-d93e-48a7-8911-6416a3e1ff30", "external-id": "nsx-vlan-transportzone-654", "segmentation_id": 654, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap15a0b4f7-15", "ovs_interfaceid": "15a0b4f7-1583-48ba-82d3-ca29e720ddab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 957.621474] env[68964]: DEBUG oslo_concurrency.lockutils [req-e7e255f3-56ef-43e9-91c0-95ed9c33a3f2 req-5123f629-7235-49b0-84f2-f4bee7f58a7b service nova] Releasing lock "refresh_cache-3770333e-4721-424d-ac86-2291c002e99a" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 961.723731] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 961.724704] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 961.942620] env[68964]: WARNING oslo_vmware.rw_handles [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 961.942620] env[68964]: ERROR oslo_vmware.rw_handles [ 961.943246] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 961.944970] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 961.945218] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Copying Virtual Disk [datastore1] vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/6a8877ac-a1e2-4c33-a063-d1c47af8db8c/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 961.945513] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c900d9cf-4bd0-40fe-bad6-7857dee331b7 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.954063] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Waiting for the task: (returnval){ [ 961.954063] env[68964]: value = "task-3431591" [ 961.954063] env[68964]: _type = "Task" [ 961.954063] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 961.962218] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Task: {'id': task-3431591, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.464116] env[68964]: DEBUG oslo_vmware.exceptions [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 962.464416] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 962.464958] env[68964]: ERROR nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 962.464958] env[68964]: Faults: ['InvalidArgument'] [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] Traceback (most recent call last): [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] yield resources [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self.driver.spawn(context, instance, image_meta, [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self._fetch_image_if_missing(context, vi) [ 
962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] image_cache(vi, tmp_image_ds_loc) [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] vm_util.copy_virtual_disk( [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] session._wait_for_task(vmdk_copy_task) [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] return self.wait_for_task(task_ref) [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] return evt.wait() [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] result = hub.switch() [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] return self.greenlet.switch() [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self.f(*self.args, **self.kw) [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] raise exceptions.translate_fault(task_info.error) [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] Faults: ['InvalidArgument'] [ 962.464958] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] [ 962.465995] env[68964]: INFO nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 
tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Terminating instance [ 962.466841] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 962.467061] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 962.467666] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 962.467859] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 962.468097] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-34e6a59e-5df9-44e4-adc1-4d4e1b7639cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.470442] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c233c329-d6fc-4b86-92fe-73dc9311fb00 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.477267] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 962.477530] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-00cbf4cd-5e3e-4330-a120-c5019630e764 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.479810] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 962.479982] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 962.480982] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb585d8e-180f-4dc3-8244-b37b0f577e8c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.485557] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){ [ 962.485557] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52e8d64d-3611-b688-900d-855e80eef777" [ 962.485557] env[68964]: _type = "Task" [ 962.485557] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 962.493756] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52e8d64d-3611-b688-900d-855e80eef777, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.539172] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 962.539393] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 962.539634] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Deleting the datastore file [datastore1] 323acb55-859a-4545-a046-1934cf98be6d {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 962.539900] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7292f6fd-f179-44b5-b153-8b40518e2612 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.546286] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Waiting for the task: (returnval){ [ 962.546286] env[68964]: value = "task-3431593" [ 962.546286] env[68964]: _type = "Task" [ 962.546286] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 962.554463] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Task: {'id': task-3431593, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.720587] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 962.746180] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 962.746506] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 962.746506] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 962.999226] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 962.999538] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating directory with path [datastore1] vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 962.999830] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5cbdc1dd-7895-420b-8ebd-6a49a1e26394 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.011623] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Created directory with path [datastore1] vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 963.011822] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Fetch image to [datastore1] vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 963.011990] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] 
vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 963.012750] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c0d540-6cc4-4670-9392-6eff2ff0b779 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.019676] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eb97449-81e5-4696-8687-3d7f3e222342 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.028711] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a313fcfb-b996-40dd-8b49-405ab7c501ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.062520] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e711bf5-36aa-40f5-8d50-18131d43e98c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.069304] env[68964]: DEBUG oslo_vmware.api [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Task: {'id': task-3431593, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08176} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 963.070789] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 963.070982] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 963.071173] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 963.071346] env[68964]: INFO nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 963.073175] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c7b3ef3b-340e-4afd-ab8a-de6364c81998 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.075018] env[68964]: DEBUG nova.compute.claims [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 963.075191] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 963.075397] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 963.100358] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 963.154256] env[68964]: DEBUG oslo_vmware.rw_handles [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 963.216660] env[68964]: DEBUG oslo_vmware.rw_handles [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 963.216660] env[68964]: DEBUG oslo_vmware.rw_handles [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 963.484636] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a1ecbf-e08a-4896-b13b-17a12fa2aa0b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.491819] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7019b0c-5b76-4917-be78-d379271e9693 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.522685] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5001bd0-f524-452b-a4ef-53c1fcac2e3b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.529603] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4a93ac5-d56b-4ce1-ac1e-37f2c40df785 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.542615] env[68964]: DEBUG nova.compute.provider_tree [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 963.551032] env[68964]: DEBUG nova.scheduler.client.report [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 963.566704] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.491s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 963.567266] env[68964]: ERROR nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 963.567266] env[68964]: Faults: ['InvalidArgument'] [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] Traceback (most recent call last): [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 963.567266] env[68964]: ERROR 
nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self.driver.spawn(context, instance, image_meta, [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self._fetch_image_if_missing(context, vi) [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] image_cache(vi, tmp_image_ds_loc) [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] vm_util.copy_virtual_disk( [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] session._wait_for_task(vmdk_copy_task) [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] return self.wait_for_task(task_ref) [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] return evt.wait() [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] result = hub.switch() [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] return self.greenlet.switch() [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] self.f(*self.args, **self.kw) [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] raise exceptions.translate_fault(task_info.error) [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] Faults: ['InvalidArgument'] [ 963.567266] env[68964]: ERROR nova.compute.manager [instance: 323acb55-859a-4545-a046-1934cf98be6d] [ 963.568206] env[68964]: DEBUG nova.compute.utils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 963.569826] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Build of instance 323acb55-859a-4545-a046-1934cf98be6d was re-scheduled: A specified parameter was not correct: fileType [ 963.569826] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 963.570224] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 963.570398] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 963.570579] env[68964]: DEBUG nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 963.570756] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 963.725222] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 963.725222] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 963.736113] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 963.736674] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 963.737022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 963.738090] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 963.738723] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49c7825c-2d6f-46f9-97cd-1a2060f4acf2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.748171] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22152860-3ac4-46ba-a148-9d0d5486c8ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.766019] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c3ae7ce-e1f0-42ff-b7f1-a4f8312b48d4 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.772145] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fbee399-213b-48f7-bd92-e9ea7aff2df5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.804335] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180938MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 963.804758] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 963.805142] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 963.876961] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.877147] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.877515] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 329835df-cb38-495e-8a0e-539a396ddc74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.877515] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.877515] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.893644] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 323acb55-859a-4545-a046-1934cf98be6d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.893812] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.893935] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.894070] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.894192] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 963.905117] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.915089] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.927887] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.943421] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.950496] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.961038] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8b11bda8-2923-4641-869b-39e4fce369b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.969425] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.979503] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 120c1330-9cdf-4db2-8c9f-1fa08dcad359 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.989949] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 963.999707] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.011102] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 66318915-69a7-4f3a-8aa2-377948732cc5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.021284] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c076ffe-9532-4d57-b044-a74a48cb147d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.035132] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 41317213-a0f2-42fc-9e44-dfe83d27a811 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.045917] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 864ec33b-2840-4ed3-b0b6-2ef062141705 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.061171] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.071958] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7853bb3-fa53-4911-818f-e03245ad3a0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.086720] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 19f90c65-2865-4fa7-b647-f69fd217e1e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.097472] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c1efc344-848b-4a98-a20a-57ebdfb5ac8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 964.097696] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 964.097837] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 964.198032] env[68964]: DEBUG nova.network.neutron [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 964.215811] env[68964]: INFO nova.compute.manager [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Took 0.64 seconds to deallocate network for instance. 
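Note on the failure handled above: the rescheduled build bottoms out in oslo.vmware translating the vCenter task error into a VimFaultException whose fault_list names the fault class ('InvalidArgument', raised here for the bad fileType parameter). A minimal sketch of how that fault surfaces and can be classified, assuming only oslo.vmware, a connected VMwareAPISession named session, and a task moref named task; the helper below is illustrative, not Nova's code:

    from oslo_vmware import exceptions as vexc

    def run_task(session, task):
        """Poll a vCenter task; flag parameter faults as non-retryable."""
        try:
            # Blocks until the task completes; on an error state oslo.vmware
            # raises exceptions.translate_fault(task_info.error), the exact
            # path shown in the traceback above.
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            if 'InvalidArgument' in (e.fault_list or []):
                # A malformed request (the fileType parameter here) fails the
                # same way on every retry, which is why the compute manager
                # reschedules the build instead of retrying on this host.
                print('non-retryable vSphere fault: %s' % e)
            raise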
[ 964.405768] env[68964]: INFO nova.scheduler.client.report [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Deleted allocations for instance 323acb55-859a-4545-a046-1934cf98be6d [ 964.427780] env[68964]: DEBUG oslo_concurrency.lockutils [None req-66653a07-a682-4a47-847e-5117363f5f48 tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "323acb55-859a-4545-a046-1934cf98be6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 279.394s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.429032] env[68964]: DEBUG oslo_concurrency.lockutils [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "323acb55-859a-4545-a046-1934cf98be6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 81.017s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.429247] env[68964]: DEBUG oslo_concurrency.lockutils [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Acquiring lock "323acb55-859a-4545-a046-1934cf98be6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 964.429446] env[68964]: DEBUG oslo_concurrency.lockutils [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "323acb55-859a-4545-a046-1934cf98be6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.429660] env[68964]: DEBUG oslo_concurrency.lockutils [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "323acb55-859a-4545-a046-1934cf98be6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.431856] env[68964]: INFO nova.compute.manager [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Terminating instance [ 964.433852] env[68964]: DEBUG nova.compute.manager [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Start destroying the instance on the hypervisor.
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 964.434250] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 964.437439] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-25e19415-5749-4592-b2ff-f44f7f102ef9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.441124] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 964.450345] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be8579df-f8ef-4013-acdb-8a28b397698d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.483427] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 323acb55-859a-4545-a046-1934cf98be6d could not be found. [ 964.484723] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 964.484723] env[68964]: INFO nova.compute.manager [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Took 0.05 seconds to destroy the instance on the hypervisor. [ 964.484723] env[68964]: DEBUG oslo.service.loopingcall [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 964.487164] env[68964]: DEBUG nova.compute.manager [-] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 964.487270] env[68964]: DEBUG nova.network.neutron [-] [instance: 323acb55-859a-4545-a046-1934cf98be6d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 964.502757] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 964.529213] env[68964]: DEBUG nova.network.neutron [-] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 964.535486] env[68964]: INFO nova.compute.manager [-] [instance: 323acb55-859a-4545-a046-1934cf98be6d] Took 0.05 seconds to deallocate network for instance. [ 964.589420] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b895ce8-c5a6-4116-afa6-7cdbb1f21ae5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.596834] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b629b981-6d0a-4564-b481-2440e3993e49 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.630328] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dafee455-71c6-4454-9d9f-382b1bf3ceda {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.633037] env[68964]: DEBUG oslo_concurrency.lockutils [None req-87df06dc-7c5c-4476-bf15-ba64b15dcefd tempest-ServersTestManualDisk-1120400847 tempest-ServersTestManualDisk-1120400847-project-member] Lock "323acb55-859a-4545-a046-1934cf98be6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.204s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.638968] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8679a93-c53d-403d-baef-969aee4a5343 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.653576] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 964.662878] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total':
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 964.685993] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 964.685993] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.881s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.687651] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.184s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.687830] env[68964]: INFO nova.compute.claims [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 965.066343] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea0575b-004f-4610-8e76-0dcc79ab5d3c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.074070] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e7c73ab-bf91-4d6d-9420-45c8066868ec {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.104388] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e22023d-3bbf-4358-8a21-7f374ba8f755 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.112558] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-842ff751-6a74-4f3a-b800-5d34c7dda28d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.126051] env[68964]: DEBUG nova.compute.provider_tree [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 965.135345] env[68964]: DEBUG nova.scheduler.client.report [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 965.153600] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.467s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 965.154208] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 965.194927] env[68964]: DEBUG nova.compute.utils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 965.196690] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 965.197029] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 965.204852] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 965.271705] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 965.297689] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:07:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5bb3ce25-7ad9-4005-bce9-b57b2f09fae6',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-312160221',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 965.297958] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 965.298130] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 965.298311] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 965.298453] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 965.298597] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 965.298803] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 965.298959] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 965.299145] env[68964]: DEBUG nova.virt.hardware [None
req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 965.299307] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 965.299479] env[68964]: DEBUG nova.virt.hardware [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 965.300396] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26d2ccce-5750-4ad5-a32e-d6d94b36203e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.308193] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31f7d18f-13cc-4954-ab38-41649eced458 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.331517] env[68964]: DEBUG nova.policy [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a4aced3d14d4dd786a654eacf697bae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4e7b4eec810a4475a868d421674362cf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 965.692224] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 965.692434] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 965.692490] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 965.720200] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.720365] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.720496] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.720622] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.720743] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.721056] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.721056] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.721221] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.721221] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.721325] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 965.721441] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 965.721945] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 965.722146] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 965.970474] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Successfully created port: 40c1b497-589b-4b89-90a8-dc5d0787588d {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 967.276798] env[68964]: DEBUG nova.compute.manager [req-01810d94-f6a1-446f-a1f0-909e7032a78d req-9ec50864-1b6a-47bf-a816-3af1754e11a0 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Received event network-vif-plugged-40c1b497-589b-4b89-90a8-dc5d0787588d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 967.277419] env[68964]: DEBUG oslo_concurrency.lockutils [req-01810d94-f6a1-446f-a1f0-909e7032a78d req-9ec50864-1b6a-47bf-a816-3af1754e11a0 service nova] Acquiring lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 967.277419] env[68964]: DEBUG oslo_concurrency.lockutils [req-01810d94-f6a1-446f-a1f0-909e7032a78d req-9ec50864-1b6a-47bf-a816-3af1754e11a0 service nova] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 967.277755] env[68964]: DEBUG oslo_concurrency.lockutils [req-01810d94-f6a1-446f-a1f0-909e7032a78d req-9ec50864-1b6a-47bf-a816-3af1754e11a0 service nova] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 967.277851] env[68964]: DEBUG nova.compute.manager [req-01810d94-f6a1-446f-a1f0-909e7032a78d req-9ec50864-1b6a-47bf-a816-3af1754e11a0 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] No waiting events found dispatching network-vif-plugged-40c1b497-589b-4b89-90a8-dc5d0787588d {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 967.280118] env[68964]: WARNING nova.compute.manager [req-01810d94-f6a1-446f-a1f0-909e7032a78d req-9ec50864-1b6a-47bf-a816-3af1754e11a0 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Received unexpected event network-vif-plugged-40c1b497-589b-4b89-90a8-dc5d0787588d for instance with vm_state building and task_state spawning.
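The WARNING just above is an ordering race rather than a failure: Neutron delivered network-vif-plugged before the driver registered a waiter for it, so the event is dispatched to no one and only logged. A standard-library sketch of that waiter pattern (names are illustrative, not Nova's implementation):

    import threading

    # (instance_uuid, event_name) -> Event a spawning thread will wait on
    _waiters = {}

    def prepare_for_event(instance_uuid, event_name):
        # Registered before triggering the action that emits the event.
        _waiters[(instance_uuid, event_name)] = threading.Event()

    def receive_external_event(instance_uuid, event_name):
        # Invoked when the network service reports the event via the API.
        ev = _waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            # No waiter registered yet: the "No waiting events found
            # dispatching ..." / "Received unexpected event" case above.
            print('unexpected %s for %s' % (event_name, instance_uuid))
        else:
            ev.set()  # wake the thread blocked on ev.wait()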
[ 967.323472] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Successfully updated port: 40c1b497-589b-4b89-90a8-dc5d0787588d {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 967.343633] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "refresh_cache-b2d9a7ec-f565-49d3-8d0d-9339504f8a86" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 967.343633] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired lock "refresh_cache-b2d9a7ec-f565-49d3-8d0d-9339504f8a86" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 967.343633] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 967.388143] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 967.625219] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Updating instance_info_cache with network_info: [{"id": "40c1b497-589b-4b89-90a8-dc5d0787588d", "address": "fa:16:3e:46:67:e3", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.123", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap40c1b497-58", "ovs_interfaceid": "40c1b497-589b-4b89-90a8-dc5d0787588d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 967.642423] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 
tempest-MigrationsAdminTest-921595949-project-member] Releasing lock "refresh_cache-b2d9a7ec-f565-49d3-8d0d-9339504f8a86" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 967.643329] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance network_info: |[{"id": "40c1b497-589b-4b89-90a8-dc5d0787588d", "address": "fa:16:3e:46:67:e3", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.123", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap40c1b497-58", "ovs_interfaceid": "40c1b497-589b-4b89-90a8-dc5d0787588d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 967.644047] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:67:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9c4edd5-d88e-4996-afea-00130ace0dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '40c1b497-589b-4b89-90a8-dc5d0787588d', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 967.655706] env[68964]: DEBUG oslo.service.loopingcall [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 967.656696] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 967.657496] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cc1f951a-393c-4278-b72c-e5636b51fa59 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 967.683403] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 967.683403] env[68964]: value = "task-3431594" [ 967.683403] env[68964]: _type = "Task" [ 967.683403] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 967.694189] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431594, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 968.195053] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431594, 'name': CreateVM_Task, 'duration_secs': 0.31194} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 968.195053] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 968.195694] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 968.195854] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 968.196200] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 968.196450] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-72d42f9a-d0f9-4677-9580-fb1ad7bd5696 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 968.201319] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){ [ 968.201319] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52b1b81c-6d27-b3f5-75ea-d2d34c4ff519" [ 968.201319] env[68964]: _type = "Task" [ 968.201319] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 968.209323] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52b1b81c-6d27-b3f5-75ea-d2d34c4ff519, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 968.712360] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 968.712671] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 968.712671] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 969.303137] env[68964]: DEBUG nova.compute.manager [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Received event network-changed-40c1b497-589b-4b89-90a8-dc5d0787588d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 969.303572] env[68964]: DEBUG nova.compute.manager [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Refreshing instance network info cache due to event network-changed-40c1b497-589b-4b89-90a8-dc5d0787588d. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 969.303924] env[68964]: DEBUG oslo_concurrency.lockutils [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] Acquiring lock "refresh_cache-b2d9a7ec-f565-49d3-8d0d-9339504f8a86" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 969.304122] env[68964]: DEBUG oslo_concurrency.lockutils [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] Acquired lock "refresh_cache-b2d9a7ec-f565-49d3-8d0d-9339504f8a86" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 969.304295] env[68964]: DEBUG nova.network.neutron [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Refreshing network info cache for port 40c1b497-589b-4b89-90a8-dc5d0787588d {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 969.833727] env[68964]: DEBUG nova.network.neutron [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Updated VIF entry in instance network info cache for port 40c1b497-589b-4b89-90a8-dc5d0787588d. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 969.833727] env[68964]: DEBUG nova.network.neutron [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Updating instance_info_cache with network_info: [{"id": "40c1b497-589b-4b89-90a8-dc5d0787588d", "address": "fa:16:3e:46:67:e3", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.123", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap40c1b497-58", "ovs_interfaceid": "40c1b497-589b-4b89-90a8-dc5d0787588d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 969.848044] env[68964]: DEBUG oslo_concurrency.lockutils [req-ee2bad42-03ab-472d-8393-e21eb759ece7 req-731e3f26-c82e-4d43-9171-58163d765850 service nova] Releasing lock "refresh_cache-b2d9a7ec-f565-49d3-8d0d-9339504f8a86" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 971.181308] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "3770333e-4721-424d-ac86-2291c002e99a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 972.615485] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "244140d1-bf22-415a-b770-05f2fe106149" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 972.615485] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "244140d1-bf22-415a-b770-05f2fe106149" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 998.788420] env[68964]: WARNING oslo_vmware.rw_handles [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Error occurred while reading the HTTP 
response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 998.788420] env[68964]: ERROR oslo_vmware.rw_handles [ 998.788898] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 998.790764] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 998.791021] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Copying Virtual Disk [datastore2] vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/b196a088-0055-4ad2-8847-3e8ccb41a4d3/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 998.791303] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-633d2684-ace1-4769-9420-f6c7b6ad573c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.798602] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for the task: (returnval){ [ 998.798602] env[68964]: value = "task-3431595" [ 998.798602] env[68964]: _type = "Task" [ 998.798602] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 998.807749] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Task: {'id': task-3431595, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 999.287437] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.308758] env[68964]: DEBUG oslo_vmware.exceptions [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 999.309057] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 999.309638] env[68964]: ERROR nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 999.309638] env[68964]: Faults: ['InvalidArgument'] [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Traceback (most recent call last): [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] yield resources [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self.driver.spawn(context, instance, image_meta, [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self._vmops.spawn(context, instance, image_meta, injected_files, [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self._fetch_image_if_missing(context, vi) [ 999.309638] env[68964]: 
ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] image_cache(vi, tmp_image_ds_loc) [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] vm_util.copy_virtual_disk( [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] session._wait_for_task(vmdk_copy_task) [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] return self.wait_for_task(task_ref) [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] return evt.wait() [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] result = hub.switch() [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] return self.greenlet.switch() [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self.f(*self.args, **self.kw) [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] raise exceptions.translate_fault(task_info.error) [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Faults: ['InvalidArgument'] [ 999.309638] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] [ 999.310564] env[68964]: INFO nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 
tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Terminating instance [ 999.311674] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 999.311834] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 999.312082] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd3e0f8a-49d2-49d4-826e-b87df6384b5f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.314256] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 999.314293] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquired lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 999.314468] env[68964]: DEBUG nova.network.neutron [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 999.321324] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 999.321498] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 999.322203] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-30c6b65d-8e7b-4cad-bae3-a9007f306f2b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.330131] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Waiting for the task: (returnval){ [ 999.330131] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522d925d-cbe8-3707-275d-6a0df2e6a4bc" [ 999.330131] env[68964]: _type = "Task" [ 999.330131] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 999.337639] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522d925d-cbe8-3707-275d-6a0df2e6a4bc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 999.348686] env[68964]: DEBUG nova.network.neutron [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 999.430965] env[68964]: DEBUG nova.network.neutron [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.439931] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Releasing lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 999.440365] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 999.440617] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 999.441701] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb69f9bc-84e7-4140-99ae-00b6e8037c2d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.449490] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 999.449711] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d3737e6c-7775-41f3-89a1-72c612e83ed9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.480106] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 999.480326] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 999.480574] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Deleting the datastore file [datastore2] 329835df-cb38-495e-8a0e-539a396ddc74 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 999.480817] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f0c5e78d-b2fd-4faa-91df-2d689755b47c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.487153] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for the task: (returnval){ [ 999.487153] env[68964]: value = "task-3431597" [ 999.487153] env[68964]: _type = "Task" [ 999.487153] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 999.494609] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Task: {'id': task-3431597, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 999.840944] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 999.842413] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Creating directory with path [datastore2] vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 999.842413] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-01e1e628-270d-42a5-838d-d176d5c53998 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.853933] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Created directory with path [datastore2] vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 999.853933] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Fetch image to [datastore2] vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 999.854151] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 999.854733] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2efebea0-6add-4993-a7d7-91eb8b8a08c2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.862146] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb900d5d-9cb5-4fb9-8e90-6f13a352fc73 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.871610] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e171a1-870b-438b-bf2a-cfea5c235db2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.904426] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-efe201bf-1f3a-44d6-91ad-891c27c59f08 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.910918] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a0eaea91-57a8-4025-b83d-b6c36da0c204 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 999.940643] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 999.996227] env[68964]: DEBUG oslo_vmware.api [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Task: {'id': task-3431597, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0418} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 999.998124] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 999.998319] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 999.998753] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 999.998753] env[68964]: INFO nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Took 0.56 seconds to destroy the instance on the hypervisor. [ 999.998963] env[68964]: DEBUG oslo.service.loopingcall [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 999.999716] env[68964]: DEBUG nova.compute.manager [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1000.001485] env[68964]: DEBUG nova.compute.claims [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1000.001671] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.001884] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.011827] env[68964]: DEBUG oslo_vmware.rw_handles [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1000.072489] env[68964]: DEBUG oslo_vmware.rw_handles [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1000.072713] env[68964]: DEBUG oslo_vmware.rw_handles [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1000.404681] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-533cc548-6cd0-41a2-bebc-5dbdcffff54a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.413033] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddcf2ddd-f65b-4e99-8b69-23bfbfbe7efa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.442578] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-961f8b42-9e68-45a8-8f61-630702928efc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.450157] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8958cbd-9841-4cee-8fcf-161e929bcba8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.464063] env[68964]: DEBUG nova.compute.provider_tree [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1000.472306] env[68964]: DEBUG nova.scheduler.client.report [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1000.486057] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.484s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1000.486588] env[68964]: ERROR nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1000.486588] env[68964]: Faults: ['InvalidArgument'] [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Traceback (most recent call last): [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1000.486588] env[68964]: ERROR nova.compute.manager 
[instance: 329835df-cb38-495e-8a0e-539a396ddc74] self.driver.spawn(context, instance, image_meta, [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self._fetch_image_if_missing(context, vi) [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] image_cache(vi, tmp_image_ds_loc) [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] vm_util.copy_virtual_disk( [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] session._wait_for_task(vmdk_copy_task) [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] return self.wait_for_task(task_ref) [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] return evt.wait() [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] result = hub.switch() [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] return self.greenlet.switch() [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] self.f(*self.args, **self.kw) [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] raise exceptions.translate_fault(task_info.error) [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Faults: ['InvalidArgument'] [ 1000.486588] env[68964]: ERROR nova.compute.manager [instance: 329835df-cb38-495e-8a0e-539a396ddc74] [ 1000.487291] env[68964]: DEBUG nova.compute.utils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1000.488635] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Build of instance 329835df-cb38-495e-8a0e-539a396ddc74 was re-scheduled: A specified parameter was not correct: fileType [ 1000.488635] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1000.489028] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1000.489262] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1000.489408] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquired lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1000.489565] env[68964]: DEBUG nova.network.neutron [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1000.519358] env[68964]: DEBUG nova.network.neutron [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1000.610435] env[68964]: DEBUG nova.network.neutron [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1000.621121] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Releasing lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1000.621121] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1000.621121] env[68964]: DEBUG nova.compute.manager [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Skipping network deallocation for instance since networking was not requested. {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1000.710853] env[68964]: INFO nova.scheduler.client.report [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Deleted allocations for instance 329835df-cb38-495e-8a0e-539a396ddc74 [ 1000.730948] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f68aa7db-94b6-484d-adfe-1ecf604e74e3 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "329835df-cb38-495e-8a0e-539a396ddc74" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 328.635s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1000.732107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "329835df-cb38-495e-8a0e-539a396ddc74" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 124.978s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.732107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "329835df-cb38-495e-8a0e-539a396ddc74-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.732107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "329835df-cb38-495e-8a0e-539a396ddc74-events" acquired 
by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.732233] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "329835df-cb38-495e-8a0e-539a396ddc74-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1000.734035] env[68964]: INFO nova.compute.manager [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Terminating instance [ 1000.735515] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquiring lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1000.735703] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Acquired lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1000.735814] env[68964]: DEBUG nova.network.neutron [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1000.746745] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1000.763979] env[68964]: DEBUG nova.network.neutron [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1000.799475] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.799719] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1000.801197] env[68964]: INFO nova.compute.claims [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1000.834764] env[68964]: DEBUG nova.network.neutron [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1000.843736] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Releasing lock "refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1000.844123] env[68964]: DEBUG nova.compute.manager [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1000.844321] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1000.844814] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b1cebbc1-6fc7-4730-b650-41ee284ddeb0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.859092] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0daa90bd-e398-4a25-8ed8-2aa686106f52 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1000.890382] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 329835df-cb38-495e-8a0e-539a396ddc74 could not be found. [ 1000.890609] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1000.890789] env[68964]: INFO nova.compute.manager [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1000.891044] env[68964]: DEBUG oslo.service.loopingcall [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1000.893516] env[68964]: DEBUG nova.compute.manager [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1000.893608] env[68964]: DEBUG nova.network.neutron [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1000.921835] env[68964]: DEBUG nova.network.neutron [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1000.930104] env[68964]: DEBUG nova.network.neutron [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1000.942243] env[68964]: INFO nova.compute.manager [-] [instance: 329835df-cb38-495e-8a0e-539a396ddc74] Took 0.05 seconds to deallocate network for instance. 
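The "Acquiring lock … by …", "Lock … acquired … :: waited N.NNNs", and "Lock … \"released\" … :: held N.NNNs" triples that recur throughout this trace (emitted from lockutils.py:404/409/423) come from oslo.concurrency's lock helpers wrapping Nova's critical sections, such as the per-instance terminate lock released just below. A minimal sketch of that pattern — an illustrative placeholder function, not Nova's actual code, with the lock names taken from the entries above:

```python
# Minimal sketch (not Nova source) of the oslo.concurrency pattern behind the
# "Acquiring lock ... / Lock ... acquired ... waited / Lock ... released ...
# held" DEBUG triples in this log. The decorated function is a placeholder.
from oslo_concurrency import lockutils

@lockutils.synchronized('329835df-cb38-495e-8a0e-539a396ddc74')
def do_terminate_instance():
    # Runs while holding the per-instance lock; lockutils logs how long the
    # caller waited to acquire it and, on release, how long it was held.
    pass

# The same helper also works as a context manager, e.g. around the
# network-info cache refreshes seen earlier in this log:
with lockutils.lock('refresh_cache-329835df-cb38-495e-8a0e-539a396ddc74'):
    pass  # read or update the cached network info here
```

Because the helper serializes callers per lock name, the waited/held timings in the surrounding entries measure contention between the concurrent tempest workers operating on the same instance.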
[ 1001.035687] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6c0189a2-5aeb-4487-a115-80e959307e37 tempest-ServersAdmin275Test-791112424 tempest-ServersAdmin275Test-791112424-project-member] Lock "329835df-cb38-495e-8a0e-539a396ddc74" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.304s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.186902] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b502e5c6-c95b-4634-aa94-b7bfc2b72c9e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.191679] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67bc2ac9-56f4-4e82-a91d-3b991c33676a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.222360] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc6b11c8-483c-4eca-a429-b7aa0a3e8086 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.229748] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0132f550-560d-44a5-a08b-55381e08ac9b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.244119] env[68964]: DEBUG nova.compute.provider_tree [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.254565] env[68964]: DEBUG nova.scheduler.client.report [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.268341] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.468s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.268829] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1001.301277] env[68964]: DEBUG nova.compute.utils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1001.304330] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1001.304330] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1001.312159] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1001.381995] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1001.386914] env[68964]: DEBUG nova.policy [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e4e2fdc38308474fa90cc324dfe1b6f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '217178a834024f5a86365c3c4d8ca9b5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1001.411045] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1001.411305] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1001.411777] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1001.412016] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1001.412176] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1001.412322] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1001.412531] env[68964]: DEBUG nova.virt.hardware [None 
req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1001.412899] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1001.413131] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1001.413308] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1001.413727] env[68964]: DEBUG nova.virt.hardware [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1001.414715] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de7338f6-72dd-448a-8992-5cdf9365886d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.423105] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29d36fe8-7441-4cef-95df-463b65eb32cf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.863101] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Successfully created port: 848d056c-c20b-4ce8-aaba-e32b7e350b08 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1002.552479] env[68964]: DEBUG nova.compute.manager [req-223593a3-a2cd-4470-ba04-0096d0611d29 req-a9cb2907-ab05-4b4c-abd9-5e9f66d05a01 service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Received event network-vif-plugged-848d056c-c20b-4ce8-aaba-e32b7e350b08 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1002.552735] env[68964]: DEBUG oslo_concurrency.lockutils [req-223593a3-a2cd-4470-ba04-0096d0611d29 req-a9cb2907-ab05-4b4c-abd9-5e9f66d05a01 service nova] Acquiring lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.552885] env[68964]: DEBUG oslo_concurrency.lockutils [req-223593a3-a2cd-4470-ba04-0096d0611d29 req-a9cb2907-ab05-4b4c-abd9-5e9f66d05a01 service nova] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.553326] env[68964]: DEBUG oslo_concurrency.lockutils [req-223593a3-a2cd-4470-ba04-0096d0611d29 req-a9cb2907-ab05-4b4c-abd9-5e9f66d05a01 service nova] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1002.553963] env[68964]: DEBUG nova.compute.manager [req-223593a3-a2cd-4470-ba04-0096d0611d29 req-a9cb2907-ab05-4b4c-abd9-5e9f66d05a01 service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] No waiting events found dispatching network-vif-plugged-848d056c-c20b-4ce8-aaba-e32b7e350b08 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1002.553963] env[68964]: WARNING nova.compute.manager [req-223593a3-a2cd-4470-ba04-0096d0611d29 req-a9cb2907-ab05-4b4c-abd9-5e9f66d05a01 service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Received unexpected event network-vif-plugged-848d056c-c20b-4ce8-aaba-e32b7e350b08 for instance with vm_state building and task_state spawning. [ 1002.598087] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Successfully updated port: 848d056c-c20b-4ce8-aaba-e32b7e350b08 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1002.612484] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "refresh_cache-8e77ed0b-ea43-4c15-94de-63c4e9d5e048" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1002.612558] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired lock "refresh_cache-8e77ed0b-ea43-4c15-94de-63c4e9d5e048" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1002.612670] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1002.654381] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance cache missing network info.
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1003.072684] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Updating instance_info_cache with network_info: [{"id": "848d056c-c20b-4ce8-aaba-e32b7e350b08", "address": "fa:16:3e:12:4a:b0", "network": {"id": "6a7fda40-db49-452a-b342-2cda28f5876b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-800300419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "217178a834024f5a86365c3c4d8ca9b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap848d056c-c2", "ovs_interfaceid": "848d056c-c20b-4ce8-aaba-e32b7e350b08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1003.087666] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Releasing lock "refresh_cache-8e77ed0b-ea43-4c15-94de-63c4e9d5e048" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1003.087965] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance network_info: |[{"id": "848d056c-c20b-4ce8-aaba-e32b7e350b08", "address": "fa:16:3e:12:4a:b0", "network": {"id": "6a7fda40-db49-452a-b342-2cda28f5876b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-800300419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "217178a834024f5a86365c3c4d8ca9b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap848d056c-c2", "ovs_interfaceid": "848d056c-c20b-4ce8-aaba-e32b7e350b08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1003.088385] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:12:4a:b0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '21310d90-efbc-45a8-a97f-c4358606530f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '848d056c-c20b-4ce8-aaba-e32b7e350b08', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1003.096041] env[68964]: DEBUG oslo.service.loopingcall [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1003.096535] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1003.096763] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d662e2ff-0cbb-4e76-9c07-04391c07839d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.117695] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1003.117695] env[68964]: value = "task-3431598"
[ 1003.117695] env[68964]: _type = "Task"
[ 1003.117695] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1003.125563] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431598, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1003.627507] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431598, 'name': CreateVM_Task, 'duration_secs': 0.292794} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1003.627711] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1003.628427] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1003.628590] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1003.628904] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1003.629167] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-544f6900-86b9-4932-8450-be1867c4584f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.633681] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){
[ 1003.633681] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5212d190-08c9-1a4a-0a16-b1f1412ba754"
[ 1003.633681] env[68964]: _type = "Task"
[ 1003.633681] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1003.641219] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5212d190-08c9-1a4a-0a16-b1f1412ba754, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1004.149868] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1004.149868] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1004.149868] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1004.597920] env[68964]: DEBUG nova.compute.manager [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Received event network-changed-848d056c-c20b-4ce8-aaba-e32b7e350b08 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1004.598134] env[68964]: DEBUG nova.compute.manager [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Refreshing instance network info cache due to event network-changed-848d056c-c20b-4ce8-aaba-e32b7e350b08. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1004.598344] env[68964]: DEBUG oslo_concurrency.lockutils [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] Acquiring lock "refresh_cache-8e77ed0b-ea43-4c15-94de-63c4e9d5e048" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1004.598563] env[68964]: DEBUG oslo_concurrency.lockutils [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] Acquired lock "refresh_cache-8e77ed0b-ea43-4c15-94de-63c4e9d5e048" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1004.598681] env[68964]: DEBUG nova.network.neutron [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Refreshing network info cache for port 848d056c-c20b-4ce8-aaba-e32b7e350b08 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1005.348297] env[68964]: DEBUG nova.network.neutron [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Updated VIF entry in instance network info cache for port 848d056c-c20b-4ce8-aaba-e32b7e350b08. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1005.348297] env[68964]: DEBUG nova.network.neutron [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Updating instance_info_cache with network_info: [{"id": "848d056c-c20b-4ce8-aaba-e32b7e350b08", "address": "fa:16:3e:12:4a:b0", "network": {"id": "6a7fda40-db49-452a-b342-2cda28f5876b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-800300419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "217178a834024f5a86365c3c4d8ca9b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap848d056c-c2", "ovs_interfaceid": "848d056c-c20b-4ce8-aaba-e32b7e350b08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1005.360194] env[68964]: DEBUG oslo_concurrency.lockutils [req-bd629f44-1e69-499e-82c7-8b31671350a0 req-a78b9ed7-2f03-4dd0-b75f-449b49ab9fcd service nova] Releasing lock "refresh_cache-8e77ed0b-ea43-4c15-94de-63c4e9d5e048" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1011.519074] env[68964]: WARNING oslo_vmware.rw_handles [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1011.519074] env[68964]: ERROR oslo_vmware.rw_handles
[ 1011.519663] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1011.521122] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1011.521355] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Copying Virtual Disk [datastore1] vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/47884837-739f-4845-939a-9a8d213b86e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1011.521635] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-73412c32-d278-4b46-b4f3-2717ba2b33f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.529796] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){
[ 1011.529796] env[68964]: value = "task-3431603"
[ 1011.529796] env[68964]: _type = "Task"
[ 1011.529796] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1011.537916] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': task-3431603, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1012.040482] env[68964]: DEBUG oslo_vmware.exceptions [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1012.040813] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1012.041369] env[68964]: ERROR nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1012.041369] env[68964]: Faults: ['InvalidArgument']
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Traceback (most recent call last):
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] yield resources
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self.driver.spawn(context, instance, image_meta,
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self._fetch_image_if_missing(context, vi)
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] image_cache(vi, tmp_image_ds_loc)
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] vm_util.copy_virtual_disk(
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] session._wait_for_task(vmdk_copy_task)
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] return self.wait_for_task(task_ref)
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] return evt.wait()
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] result = hub.switch()
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] return self.greenlet.switch()
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self.f(*self.args, **self.kw)
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] raise exceptions.translate_fault(task_info.error)
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Faults: ['InvalidArgument']
[ 1012.041369] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a]
[ 1012.042292] env[68964]: INFO nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Terminating instance [ 1012.043202] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1012.043405] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1012.043667] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e1cbc432-9091-414e-b3c7-addfb3738995 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.045935] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1012.046214] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1012.046940] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fb856b8-206b-4277-8f2e-90f31752aeb2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.053779] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1012.054742] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2cced040-a20a-4807-8948-d6ece57d1d1b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.056130] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1012.056318] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1012.056979] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-caf006a8-d5f3-4d4b-9422-3e2ecb0990c9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.062132] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for the task: (returnval){
[ 1012.062132] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]520bfee7-722c-cb2a-bbde-16b6b6ac9883"
[ 1012.062132] env[68964]: _type = "Task"
[ 1012.062132] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1012.069278] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]520bfee7-722c-cb2a-bbde-16b6b6ac9883, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1012.137582] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1012.137826] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1012.138041] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Deleting the datastore file [datastore1] fe587d8a-aa99-4163-a2a2-a97c0cfbb82a {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1012.138317] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fbfe1f9f-f7ac-486b-a159-d2b53dcad77b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.145224] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){
[ 1012.145224] env[68964]: value = "task-3431605"
[ 1012.145224] env[68964]: _type = "Task"
[ 1012.145224] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1012.153203] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': task-3431605, 'name': DeleteDatastoreFile_Task} progress is 0%.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1012.571933] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1012.571933] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Creating directory with path [datastore1] vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1012.572305] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-89dc7343-0432-43a9-a5bd-30c2f3fe545b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.582723] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Created directory with path [datastore1] vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1012.582926] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Fetch image to [datastore1] vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1012.583161] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1012.583873] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26b24915-8bb2-418e-af0b-abe19a6e2d55 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.590300] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34012e3a-2840-4af9-9dfd-18e2148c9fc9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.599278] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-010f7c4d-13c2-4c4c-ab6d-e6c587993b4e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.630851] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1560d786-84e1-403b-aff1-82b66a008d93 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.636489] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-106f2983-f2b8-4fde-8291-dbde79b20732 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1012.653566] env[68964]: DEBUG oslo_vmware.api [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': task-3431605, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063788} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1012.653813] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1012.653997] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1012.654183] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1012.654374] env[68964]: INFO nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1012.656631] env[68964]: DEBUG nova.compute.claims [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1012.656800] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1012.657088] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1012.661764] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1012.713248] env[68964]: DEBUG oslo_vmware.rw_handles [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1012.781418] env[68964]: DEBUG oslo_vmware.rw_handles [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1012.781596] env[68964]: DEBUG oslo_vmware.rw_handles [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1013.082319] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0a6cacf-3d49-4598-b299-7efa6b4d4ae6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.089622] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26bcf04a-f598-45c7-bd2e-b2372201de99 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.119326] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-070cd00d-7535-46b5-91cd-1a027ed3abc6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.126300] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f29d827-b21b-4426-a6ea-af5ea30b2b37 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.139491] env[68964]: DEBUG nova.compute.provider_tree [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1013.148702] env[68964]: DEBUG nova.scheduler.client.report [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1013.163659] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.506s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1013.164256] env[68964]: ERROR nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1013.164256] env[68964]: Faults: ['InvalidArgument'] [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Traceback (most recent call last): [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1013.164256] env[68964]: ERROR 
nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self.driver.spawn(context, instance, image_meta, [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self._fetch_image_if_missing(context, vi) [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] image_cache(vi, tmp_image_ds_loc) [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] vm_util.copy_virtual_disk( [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] session._wait_for_task(vmdk_copy_task) [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] return self.wait_for_task(task_ref) [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] return evt.wait() [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] result = hub.switch() [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] return self.greenlet.switch() [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] self.f(*self.args, **self.kw) [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] raise exceptions.translate_fault(task_info.error) [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Faults: ['InvalidArgument'] [ 1013.164256] env[68964]: ERROR nova.compute.manager [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] [ 1013.165123] env[68964]: DEBUG nova.compute.utils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1013.166416] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Build of instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a was re-scheduled: A specified parameter was not correct: fileType [ 1013.166416] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1013.166802] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1013.166974] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1013.167164] env[68964]: DEBUG nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1013.167326] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1013.449478] env[68964]: DEBUG nova.network.neutron [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1013.467076] env[68964]: INFO nova.compute.manager [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Took 0.30 seconds to deallocate network for instance. [ 1013.574527] env[68964]: INFO nova.scheduler.client.report [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Deleted allocations for instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a [ 1013.599196] env[68964]: DEBUG oslo_concurrency.lockutils [None req-afe7f25b-3262-43b4-8e30-3f287d038054 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 327.865s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1013.600481] env[68964]: DEBUG oslo_concurrency.lockutils [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 128.935s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1013.600607] env[68964]: DEBUG oslo_concurrency.lockutils [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1013.601674] env[68964]: DEBUG oslo_concurrency.lockutils [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1013.601674] env[68964]: DEBUG oslo_concurrency.lockutils [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1013.604095] env[68964]: INFO nova.compute.manager [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Terminating instance [ 1013.606284] env[68964]: DEBUG nova.compute.manager [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1013.608526] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1013.608526] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7b0490f3-d8c6-40cf-ae31-9bf3eb4bab24 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.617737] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef802b7b-ad39-437b-abcf-b2095b157973 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1013.629547] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1013.651037] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fe587d8a-aa99-4163-a2a2-a97c0cfbb82a could not be found. [ 1013.651128] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1013.651267] env[68964]: INFO nova.compute.manager [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Took 0.04 seconds to destroy the instance on the hypervisor. 
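The records above show the per-instance lock serializing a slow, failed build against the pending terminate: `_locked_do_build_and_run_instance` held the lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" for 327.865s, so `do_terminate_instance` had to wait 128.935s before it could begin destroying the instance. A minimal sketch of that oslo.concurrency pattern (`terminate` is a hypothetical stand-in, not Nova's code):

```python
from oslo_concurrency import lockutils

def terminate():
    # hypothetical stand-in for do_terminate_instance
    print('terminating')

# Build and terminate both guard the instance with the same named lock,
# so terminate blocks until the (failed) build path releases it.
with lockutils.lock('fe587d8a-aa99-4163-a2a2-a97c0cfbb82a'):
    terminate()
```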
[ 1013.651584] env[68964]: DEBUG oslo.service.loopingcall [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1013.651739] env[68964]: DEBUG nova.compute.manager [-] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1013.651883] env[68964]: DEBUG nova.network.neutron [-] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1013.679630] env[68964]: DEBUG nova.network.neutron [-] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1013.688598] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1013.688877] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1013.690476] env[68964]: INFO nova.compute.claims [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1013.693659] env[68964]: INFO nova.compute.manager [-] [instance: fe587d8a-aa99-4163-a2a2-a97c0cfbb82a] Took 0.04 seconds to deallocate network for instance. 
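The "Waiting for function ... _deallocate_network_with_retries to return" record is oslo.service's looping-call machinery: the wrapped function is invoked repeatedly until it raises `LoopingCallDone`, and the caller blocks on the returned event. A sketch of that contract, assuming the fixed-interval variant (the log does not identify which looping-call class is in use here):

```python
from oslo_service import loopingcall

def _deallocate():
    # hypothetical stand-in for _deallocate_network_with_retries;
    # a real implementation would retry and only raise once finished
    raise loopingcall.LoopingCallDone(retvalue=None)

timer = loopingcall.FixedIntervalLoopingCall(_deallocate)
result = timer.start(interval=1).wait()  # blocks until LoopingCallDone
```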
[ 1013.788380] env[68964]: DEBUG oslo_concurrency.lockutils [None req-521bc857-f9b6-4a57-9aa4-0adeed360f3d tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "fe587d8a-aa99-4163-a2a2-a97c0cfbb82a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.188s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.030580] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac1aa021-b106-4074-9c25-d35c8434a457 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.038104] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a14972c8-27de-4257-b840-4fc443296581 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.069907] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04961c64-55b8-4ec1-9854-97795fb69080 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.077222] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc71e43b-e81a-4e26-9e7e-d582a8de86be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.090358] env[68964]: DEBUG nova.compute.provider_tree [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1014.099442] env[68964]: DEBUG nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1014.114538] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.426s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1014.115013] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1014.149451] env[68964]: DEBUG nova.compute.utils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1014.150538] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1014.150769] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1014.158941] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1014.220602] env[68964]: DEBUG nova.policy [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c4c788b1bac47f3a0c88659674aac04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a3853f564fd477e93d834349f4b34c9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1014.232184] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1014.264026] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1014.264286] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1014.264441] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1014.264621] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1014.264767] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1014.264909] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1014.265133] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1014.265297] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1014.265458] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1014.265615] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1014.265784] env[68964]: DEBUG nova.virt.hardware [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1014.266999] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6edb2ec3-4686-4b56-abd2-c312272c9456 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.275490] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c126864-6cec-4fbe-8192-3f21e1cedb2e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1014.580389] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Successfully created port: a5adea0b-67c6-454f-ac34-63a045662c47 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1014.631244] env[68964]: DEBUG oslo_concurrency.lockutils [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "ff09bdbb-84e3-4182-8118-e99512a0e9de" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.428337] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Successfully updated port: a5adea0b-67c6-454f-ac34-63a045662c47 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1015.443129] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "refresh_cache-7f3f326c-2127-426e-a137-6f33512f4cb2" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1015.443287] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 
tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquired lock "refresh_cache-7f3f326c-2127-426e-a137-6f33512f4cb2" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1015.443435] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1015.504463] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1015.533399] env[68964]: DEBUG nova.compute.manager [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Received event network-vif-plugged-a5adea0b-67c6-454f-ac34-63a045662c47 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1015.533612] env[68964]: DEBUG oslo_concurrency.lockutils [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] Acquiring lock "7f3f326c-2127-426e-a137-6f33512f4cb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.533817] env[68964]: DEBUG oslo_concurrency.lockutils [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.534028] env[68964]: DEBUG oslo_concurrency.lockutils [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.534202] env[68964]: DEBUG nova.compute.manager [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] No waiting events found dispatching network-vif-plugged-a5adea0b-67c6-454f-ac34-63a045662c47 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1015.534365] env[68964]: WARNING nova.compute.manager [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Received unexpected event network-vif-plugged-a5adea0b-67c6-454f-ac34-63a045662c47 for instance with vm_state building and task_state spawning. 
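The event records above show the dispatch rule for Neutron's external events: an event named network-vif-plugged-<port_id> completes a registered waiter if one exists, otherwise it is logged as unexpected (here the instance is still building, so nothing was waiting on the plug event). An illustrative pop-or-warn sketch, not Nova's actual `pop_instance_event` implementation:

```python
waiters = {}  # (instance_uuid, event_name) -> callback

def dispatch(instance_uuid, event_name):
    callback = waiters.pop((instance_uuid, event_name), None)
    if callback is None:
        print('Received unexpected event %s for instance %s'
              % (event_name, instance_uuid))
    else:
        callback()

dispatch('7f3f326c-2127-426e-a137-6f33512f4cb2',
         'network-vif-plugged-a5adea0b-67c6-454f-ac34-63a045662c47')
```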
[ 1015.534517] env[68964]: DEBUG nova.compute.manager [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Received event network-changed-a5adea0b-67c6-454f-ac34-63a045662c47 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1015.534664] env[68964]: DEBUG nova.compute.manager [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Refreshing instance network info cache due to event network-changed-a5adea0b-67c6-454f-ac34-63a045662c47. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1015.534825] env[68964]: DEBUG oslo_concurrency.lockutils [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] Acquiring lock "refresh_cache-7f3f326c-2127-426e-a137-6f33512f4cb2" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1015.713201] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Updating instance_info_cache with network_info: [{"id": "a5adea0b-67c6-454f-ac34-63a045662c47", "address": "fa:16:3e:b4:94:61", "network": {"id": "796c15a7-5f1f-4673-a307-bc84a0a6caca", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2138848524-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a3853f564fd477e93d834349f4b34c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "be8bd197-4b2b-46e7-88ea-2554b0438584", "external-id": "nsx-vlan-transportzone-338", "segmentation_id": 338, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5adea0b-67", "ovs_interfaceid": "a5adea0b-67c6-454f-ac34-63a045662c47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1015.726998] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Releasing lock "refresh_cache-7f3f326c-2127-426e-a137-6f33512f4cb2" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1015.727314] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance network_info: |[{"id": "a5adea0b-67c6-454f-ac34-63a045662c47", "address": "fa:16:3e:b4:94:61", "network": {"id": "796c15a7-5f1f-4673-a307-bc84a0a6caca", "bridge": "br-int", "label": 
"tempest-FloatingIPsAssociationTestJSON-2138848524-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a3853f564fd477e93d834349f4b34c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "be8bd197-4b2b-46e7-88ea-2554b0438584", "external-id": "nsx-vlan-transportzone-338", "segmentation_id": 338, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5adea0b-67", "ovs_interfaceid": "a5adea0b-67c6-454f-ac34-63a045662c47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1015.727613] env[68964]: DEBUG oslo_concurrency.lockutils [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] Acquired lock "refresh_cache-7f3f326c-2127-426e-a137-6f33512f4cb2" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1015.727785] env[68964]: DEBUG nova.network.neutron [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Refreshing network info cache for port a5adea0b-67c6-454f-ac34-63a045662c47 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1015.731031] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:94:61', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'be8bd197-4b2b-46e7-88ea-2554b0438584', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a5adea0b-67c6-454f-ac34-63a045662c47', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1015.737305] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Creating folder: Project (7a3853f564fd477e93d834349f4b34c9). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1015.737979] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e60bcca5-9464-44df-8a2e-97ef0135c276 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.751361] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Created folder: Project (7a3853f564fd477e93d834349f4b34c9) in parent group-v684465. 
[ 1015.751485] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Creating folder: Instances. Parent ref: group-v684533. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1015.751698] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0a0aeb98-3c96-4501-be3d-628b3ec64ac3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.760656] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Created folder: Instances in parent group-v684533. [ 1015.760862] env[68964]: DEBUG oslo.service.loopingcall [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1015.761060] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1015.761253] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b5c24d4-3acb-405b-9437-321817d73fb3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.779802] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1015.779802] env[68964]: value = "task-3431610" [ 1015.779802] env[68964]: _type = "Task" [ 1015.779802] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1015.790325] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431610, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1016.031318] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1016.249531] env[68964]: DEBUG nova.network.neutron [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Updated VIF entry in instance network info cache for port a5adea0b-67c6-454f-ac34-63a045662c47. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1016.249908] env[68964]: DEBUG nova.network.neutron [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Updating instance_info_cache with network_info: [{"id": "a5adea0b-67c6-454f-ac34-63a045662c47", "address": "fa:16:3e:b4:94:61", "network": {"id": "796c15a7-5f1f-4673-a307-bc84a0a6caca", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-2138848524-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7a3853f564fd477e93d834349f4b34c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "be8bd197-4b2b-46e7-88ea-2554b0438584", "external-id": "nsx-vlan-transportzone-338", "segmentation_id": 338, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa5adea0b-67", "ovs_interfaceid": "a5adea0b-67c6-454f-ac34-63a045662c47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1016.259238] env[68964]: DEBUG oslo_concurrency.lockutils [req-c6472168-e3f5-4687-9bce-e12ad5224d30 req-30a7ef6b-279f-4db5-b584-a5ee0f02d77e service nova] Releasing lock "refresh_cache-7f3f326c-2127-426e-a137-6f33512f4cb2" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1016.289365] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431610, 'name': CreateVM_Task, 'duration_secs': 0.283898} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1016.289531] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1016.290255] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1016.290416] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1016.290718] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1016.291036] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac428570-3e28-48e3-a237-09d58dbcf616 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.295454] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Waiting for the task: (returnval){ [ 1016.295454] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52fca872-26ab-c422-3e4d-c0cec09ce366" [ 1016.295454] env[68964]: _type = "Task" [ 1016.295454] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1016.303538] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52fca872-26ab-c422-3e4d-c0cec09ce366, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1016.806136] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1016.806413] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1016.806608] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1021.724694] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1022.723923] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1022.724188] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1022.724368] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1023.724928] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1024.719551] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1024.724240] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1024.724433] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1024.735896] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.736148] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.736298] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.736473] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1024.739438] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d89db81-49ee-4b75-b345-8ce2a733bf34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.748459] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc8f6e6f-90d9-4e6b-90f6-93da1933ae05 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.762819] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1b633af-090f-4514-b52f-e6676ad9f5c0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.769076] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-977a7b8a-b9e3-4439-aa3b-f63c54ff1a4c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.797977] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180895MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1024.798151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.798343] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.869618] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.869783] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.869913] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870050] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870216] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870344] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870461] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870579] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870693] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.870807] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1024.881836] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.891654] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.900980] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8b11bda8-2923-4641-869b-39e4fce369b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.910660] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.920906] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 120c1330-9cdf-4db2-8c9f-1fa08dcad359 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.931534] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.942510] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.954609] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 66318915-69a7-4f3a-8aa2-377948732cc5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.965811] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c076ffe-9532-4d57-b044-a74a48cb147d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.977292] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 41317213-a0f2-42fc-9e44-dfe83d27a811 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1024.990362] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 864ec33b-2840-4ed3-b0b6-2ef062141705 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1025.002654] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1025.012556] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7853bb3-fa53-4911-818f-e03245ad3a0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1025.023518] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 19f90c65-2865-4fa7-b647-f69fd217e1e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1025.035637] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c1efc344-848b-4a98-a20a-57ebdfb5ac8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1025.046054] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1025.046054] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1025.046054] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1025.323671] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9f4f2e0-2f3d-4957-bdd4-fcf32feaac23 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.331349] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31436de7-0c5e-4ba6-81d3-2bfbf13301ba {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.361707] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-747d5705-6656-4310-b40c-4142ac0f9155 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.368995] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abc41c15-9923-4724-a838-17991ae714a7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.382088] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1025.390207] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1025.403623] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1025.403814] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1026.404223] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1026.404475] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1026.404520] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1026.426088] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.426221] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.426482] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.426623] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.426747] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.426869] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.426982] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.427598] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.427598] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.427598] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1026.427598] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1026.427967] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1026.722272] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquiring lock "07d247db-b7ca-4b5f-818f-17411296d08f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.722508] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "07d247db-b7ca-4b5f-818f-17411296d08f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1049.994309] env[68964]: WARNING oslo_vmware.rw_handles [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1049.994309] env[68964]: ERROR oslo_vmware.rw_handles [ 1049.994878] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 
tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1049.997309] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1049.997591] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Copying Virtual Disk [datastore2] vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/1771a315-8f53-49e0-a3e5-3afea85b9633/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1049.998020] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2a243ad4-6e1e-4301-9c08-e7c0504cd274 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.006834] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Waiting for the task: (returnval){ [ 1050.006834] env[68964]: value = "task-3431615" [ 1050.006834] env[68964]: _type = "Task" [ 1050.006834] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1050.015427] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Task: {'id': task-3431615, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1050.517719] env[68964]: DEBUG oslo_vmware.exceptions [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1050.518057] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1050.518631] env[68964]: ERROR nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1050.518631] env[68964]: Faults: ['InvalidArgument'] [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Traceback (most recent call last): [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] yield resources [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self.driver.spawn(context, instance, image_meta, [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self._fetch_image_if_missing(context, vi) [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] image_cache(vi, tmp_image_ds_loc) [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] vm_util.copy_virtual_disk( [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] session._wait_for_task(vmdk_copy_task) [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] return self.wait_for_task(task_ref) [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] return evt.wait() [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] result = hub.switch() [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] return self.greenlet.switch() [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self.f(*self.args, **self.kw) [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] raise exceptions.translate_fault(task_info.error) [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Faults: ['InvalidArgument'] [ 1050.518631] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] [ 1050.519531] env[68964]: INFO nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Terminating instance [ 1050.520552] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1050.520765] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1050.521013] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-9ef4e7ba-1e10-46b9-811c-32fd21938a2a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.523476] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1050.523658] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1050.524391] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2f71616-2bcf-4480-b6c0-4e980e2702a3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.531416] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1050.531660] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-92807a82-2b25-4430-a105-25f560df076c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.533933] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1050.534118] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1050.535132] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dd52764c-ab67-4dd5-98d1-b474a32714a1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.540940] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Waiting for the task: (returnval){ [ 1050.540940] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52dc5a16-8632-a722-a437-a3616b7bdd11" [ 1050.540940] env[68964]: _type = "Task" [ 1050.540940] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1050.547032] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52dc5a16-8632-a722-a437-a3616b7bdd11, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1050.602402] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1050.602595] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1050.602813] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Deleting the datastore file [datastore2] 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1050.603095] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6c2087e5-d1ad-40bb-be14-837f1e29420d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1050.608922] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Waiting for the task: (returnval){ [ 1050.608922] env[68964]: value = "task-3431617" [ 1050.608922] env[68964]: _type = "Task" [ 1050.608922] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1050.616790] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Task: {'id': task-3431617, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1051.051086] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1051.051378] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Creating directory with path [datastore2] vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1051.051634] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-966375d5-e4ab-425e-baa9-6680ef81d474 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.062823] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Created directory with path [datastore2] vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1051.063021] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Fetch image to [datastore2] vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1051.063196] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1051.063901] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6727f60d-ca9f-48e7-9d67-4f95a0363fe3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.070470] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a927f7-95f8-445f-913c-ca1db1baef78 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.079195] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f092358e-20dc-49f7-b5df-d02e4dad112d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.113099] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c090fe8-9910-4b6e-8be5-c1ece5cb128f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.120034] env[68964]: DEBUG oslo_vmware.api [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Task: {'id': task-3431617, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076977} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1051.121487] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1051.121706] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1051.121885] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1051.122072] env[68964]: INFO nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Took 0.60 seconds to destroy the instance on the hypervisor. 
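
The task-3431617 entries above show the oslo.vmware pattern used throughout this log: an asynchronous vCenter task handle is returned immediately ("Waiting for the task"), then polled ("progress is 0%.") until it reaches a terminal state, at which point the duration is reported or the fault is raised. A minimal sketch of that polling loop, with hypothetical names (get_task_info, TaskFailed) standing in for the real oslo.vmware internals:

    import time


    class TaskFailed(Exception):
        """Raised when the polled task ends in an error state."""


    def wait_for_task(get_task_info, poll_interval=0.5):
        # get_task_info is assumed to return an object shaped like a
        # VMware TaskInfo: .state in ('running', 'success', 'error'),
        # plus .progress and .error.
        while True:
            info = get_task_info()
            if info.state == 'success':
                return info  # caller logs duration_secs, as above
            if info.state == 'error':
                # oslo.vmware translates the fault into a typed
                # exception (e.g. VimFaultException) before raising;
                # this sketch just wraps it.
                raise TaskFailed(info.error)
            print("progress is %d%%" % info.progress)
            time.sleep(poll_interval)
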
[ 1051.123820] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-34601bd3-8a37-4306-a9a3-1081af03c229 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.125675] env[68964]: DEBUG nova.compute.claims [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1051.125845] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1051.126067] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1051.146848] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1051.202121] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1051.261185] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1051.261365] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1051.541777] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f875898-283f-4e43-bbc8-4b60a389d252 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.548957] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e166864-bef7-4835-bd05-559c8d3dd5ec {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.586938] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458f9be0-3e36-4221-8daa-02b731e01568 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.594327] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-970596c7-9bcf-4b8e-b8ce-018dbca207a5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1051.608603] env[68964]: DEBUG nova.compute.provider_tree [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1051.619852] env[68964]: DEBUG nova.scheduler.client.report [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1051.638718] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.513s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1051.639275] env[68964]: ERROR nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1051.639275] env[68964]: Faults: ['InvalidArgument'] [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Traceback (most recent call last): [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self.driver.spawn(context, instance, image_meta, [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self._fetch_image_if_missing(context, vi) [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] image_cache(vi, tmp_image_ds_loc) [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] vm_util.copy_virtual_disk( [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] session._wait_for_task(vmdk_copy_task) [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] return self.wait_for_task(task_ref) [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] return evt.wait() [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] result = hub.switch() [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] return self.greenlet.switch() [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] self.f(*self.args, **self.kw) [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 
81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] raise exceptions.translate_fault(task_info.error) [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Faults: ['InvalidArgument'] [ 1051.639275] env[68964]: ERROR nova.compute.manager [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] [ 1051.640050] env[68964]: DEBUG nova.compute.utils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1051.641449] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Build of instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 was re-scheduled: A specified parameter was not correct: fileType [ 1051.641449] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1051.641840] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1051.642037] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1051.642218] env[68964]: DEBUG nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1051.642381] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1051.981385] env[68964]: DEBUG nova.network.neutron [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1051.998477] env[68964]: INFO nova.compute.manager [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Took 0.35 seconds to deallocate network for instance. [ 1052.093755] env[68964]: INFO nova.scheduler.client.report [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Deleted allocations for instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 [ 1052.118058] env[68964]: DEBUG oslo_concurrency.lockutils [None req-efa44a84-e833-4b6d-86d1-8b55a8a06b56 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 385.884s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.119093] env[68964]: DEBUG oslo_concurrency.lockutils [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 185.491s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.119382] env[68964]: DEBUG oslo_concurrency.lockutils [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Acquiring lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.119639] env[68964]: DEBUG oslo_concurrency.lockutils [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock 
"81b4b5ea-2bfa-4521-99f4-5ee0673afbc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.119856] env[68964]: DEBUG oslo_concurrency.lockutils [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.121872] env[68964]: INFO nova.compute.manager [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Terminating instance [ 1052.124408] env[68964]: DEBUG nova.compute.manager [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1052.124501] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1052.124944] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-474be11c-d73f-4ce4-ba6f-c1afb1fb8ad1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.134544] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a461e4-4d14-4a62-b38d-432cc561fcd2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.145499] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1052.166435] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0 could not be found. 
[ 1052.166894] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1052.166894] env[68964]: INFO nova.compute.manager [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1052.167074] env[68964]: DEBUG oslo.service.loopingcall [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1052.167287] env[68964]: DEBUG nova.compute.manager [-] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1052.167383] env[68964]: DEBUG nova.network.neutron [-] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1052.191974] env[68964]: DEBUG nova.network.neutron [-] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1052.193186] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.193483] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.194898] env[68964]: INFO nova.compute.claims [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1052.199029] env[68964]: INFO nova.compute.manager [-] [instance: 81b4b5ea-2bfa-4521-99f4-5ee0673afbc0] Took 0.03 seconds to deallocate network for instance. 
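
The claim for instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 above is serialized on the "compute_resources" lock, and the earlier failed build returned its resources through abort_instance_claim under the same lock. A simplified sketch of that pattern, using a plain threading.Lock where Nova actually uses oslo_concurrency locking:

    import threading

    _compute_resources = threading.Lock()
    _usage = {'VCPU': 0, 'MEMORY_MB': 0, 'DISK_GB': 0}


    def instance_claim(requested):
        # e.g. requested = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}
        with _compute_resources:
            for rc, amount in requested.items():
                _usage[rc] += amount
        return requested


    def abort_instance_claim(claimed):
        # Mirrors the "Aborting claim" path taken when a build fails.
        with _compute_resources:
            for rc, amount in claimed.items():
                _usage[rc] -= amount
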
[ 1052.318102] env[68964]: DEBUG oslo_concurrency.lockutils [None req-086730a1-5106-492e-91a5-bf9f7d858dc5 tempest-VolumesAssistedSnapshotsTest-1248526325 tempest-VolumesAssistedSnapshotsTest-1248526325-project-member] Lock "81b4b5ea-2bfa-4521-99f4-5ee0673afbc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.415936] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "7f3f326c-2127-426e-a137-6f33512f4cb2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.599577] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be138d48-01b3-4db8-88dc-386aa20d7230 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.605916] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6171b5ec-380a-4205-a3fd-48ee764c08bf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.638664] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2552687-d6de-4914-b136-0f3f4efd5a4a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.646554] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3433d8a7-2865-4932-8fb0-cca0a184a2ac {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.661081] env[68964]: DEBUG nova.compute.provider_tree [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1052.672497] env[68964]: DEBUG nova.scheduler.client.report [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1052.692129] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.498s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1052.692684] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1052.752312] env[68964]: DEBUG nova.compute.utils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1052.753827] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1052.754108] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1052.764920] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1052.829761] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1052.860954] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1052.861220] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1052.861376] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1052.861553] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1052.861731] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1052.861909] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1052.862139] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1052.862300] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1052.862463] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1052.862623] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1052.862788] env[68964]: DEBUG nova.virt.hardware [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1052.863645] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3987ff0-64b5-4d75-960c-1c19ef7272d9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1052.874024] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd64154-e006-49b2-8b86-138f11e5564f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1053.046367] env[68964]: DEBUG nova.policy [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56d87ef4edc54f38b2230795d36b25a0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b0e0fedf90f941bea51c9cdc6dfd97be', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1053.492760] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Successfully created port: ae7fcece-49d9-42b1-8a48-59eecef80073 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1054.288408] env[68964]: DEBUG nova.compute.manager [req-a970e8c7-90e9-434b-b465-56cbccc276f3 req-229eba76-5c71-4954-987f-d24a27322f3a service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Received event network-vif-plugged-ae7fcece-49d9-42b1-8a48-59eecef80073 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1054.288625] env[68964]: DEBUG oslo_concurrency.lockutils [req-a970e8c7-90e9-434b-b465-56cbccc276f3 req-229eba76-5c71-4954-987f-d24a27322f3a service nova] Acquiring lock "2d0469ba-ad42-4b06-ade2-cd64487278c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1054.288833] env[68964]: DEBUG oslo_concurrency.lockutils [req-a970e8c7-90e9-434b-b465-56cbccc276f3 req-229eba76-5c71-4954-987f-d24a27322f3a service nova] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1054.288996] env[68964]: DEBUG oslo_concurrency.lockutils [req-a970e8c7-90e9-434b-b465-56cbccc276f3 req-229eba76-5c71-4954-987f-d24a27322f3a service nova] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1054.289177] env[68964]: DEBUG nova.compute.manager [req-a970e8c7-90e9-434b-b465-56cbccc276f3 req-229eba76-5c71-4954-987f-d24a27322f3a service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] No waiting events found dispatching network-vif-plugged-ae7fcece-49d9-42b1-8a48-59eecef80073 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1054.289842] env[68964]: WARNING nova.compute.manager [req-a970e8c7-90e9-434b-b465-56cbccc276f3 req-229eba76-5c71-4954-987f-d24a27322f3a service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Received unexpected event network-vif-plugged-ae7fcece-49d9-42b1-8a48-59eecef80073 for instance with vm_state building and task_state spawning. [ 1054.330354] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Successfully updated port: ae7fcece-49d9-42b1-8a48-59eecef80073 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1054.341833] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "refresh_cache-2d0469ba-ad42-4b06-ade2-cd64487278c5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1054.342036] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquired lock "refresh_cache-2d0469ba-ad42-4b06-ade2-cd64487278c5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1054.342189] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1054.412979] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1054.639453] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Updating instance_info_cache with network_info: [{"id": "ae7fcece-49d9-42b1-8a48-59eecef80073", "address": "fa:16:3e:29:52:62", "network": {"id": "97efec80-6369-49ce-b822-8b7d9c4411ac", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1595213020-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b0e0fedf90f941bea51c9cdc6dfd97be", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "02bbcead-d833-4543-bec6-fb82dfe659ff", "external-id": "nsx-vlan-transportzone-478", "segmentation_id": 478, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae7fcece-49", "ovs_interfaceid": "ae7fcece-49d9-42b1-8a48-59eecef80073", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1054.654346] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Releasing lock "refresh_cache-2d0469ba-ad42-4b06-ade2-cd64487278c5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1054.654645] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance network_info: |[{"id": "ae7fcece-49d9-42b1-8a48-59eecef80073", "address": "fa:16:3e:29:52:62", "network": {"id": "97efec80-6369-49ce-b822-8b7d9c4411ac", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1595213020-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b0e0fedf90f941bea51c9cdc6dfd97be", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "02bbcead-d833-4543-bec6-fb82dfe659ff", "external-id": "nsx-vlan-transportzone-478", "segmentation_id": 478, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae7fcece-49", "ovs_interfaceid": "ae7fcece-49d9-42b1-8a48-59eecef80073", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1054.655071] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:52:62', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '02bbcead-d833-4543-bec6-fb82dfe659ff', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ae7fcece-49d9-42b1-8a48-59eecef80073', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1054.663291] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Creating folder: Project (b0e0fedf90f941bea51c9cdc6dfd97be). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1054.663829] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4f98e1d3-7021-44a5-bd7b-ddf32bf72005 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.675085] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Created folder: Project (b0e0fedf90f941bea51c9cdc6dfd97be) in parent group-v684465. [ 1054.675280] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Creating folder: Instances. Parent ref: group-v684537. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1054.675497] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-541e11bb-b3b2-4c4c-aff4-ec3a34e2a318 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.683971] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Created folder: Instances in parent group-v684537. [ 1054.684237] env[68964]: DEBUG oslo.service.loopingcall [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1054.684436] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1054.684636] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1d07eb10-a5f9-4a93-a212-025a904477ea {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1054.705036] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1054.705036] env[68964]: value = "task-3431620" [ 1054.705036] env[68964]: _type = "Task" [ 1054.705036] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1054.713973] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431620, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.218476] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431620, 'name': CreateVM_Task, 'duration_secs': 0.295711} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1055.218716] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1055.219658] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1055.219903] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1055.220348] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1055.220690] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a03695a-e180-46b7-9178-9ac4a3b01d62 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1055.226881] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Waiting for the task: (returnval){ [ 1055.226881] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]526ccbde-5946-a8f5-39a7-bac04a4cab3b" [ 1055.226881] env[68964]: _type = "Task" [ 1055.226881] env[68964]: } to 
complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1055.240339] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]526ccbde-5946-a8f5-39a7-bac04a4cab3b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1055.739876] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1055.740174] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1055.740384] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1056.333087] env[68964]: DEBUG nova.compute.manager [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Received event network-changed-ae7fcece-49d9-42b1-8a48-59eecef80073 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1056.333170] env[68964]: DEBUG nova.compute.manager [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Refreshing instance network info cache due to event network-changed-ae7fcece-49d9-42b1-8a48-59eecef80073. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1056.333361] env[68964]: DEBUG oslo_concurrency.lockutils [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] Acquiring lock "refresh_cache-2d0469ba-ad42-4b06-ade2-cd64487278c5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1056.333489] env[68964]: DEBUG oslo_concurrency.lockutils [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] Acquired lock "refresh_cache-2d0469ba-ad42-4b06-ade2-cd64487278c5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1056.333642] env[68964]: DEBUG nova.network.neutron [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Refreshing network info cache for port ae7fcece-49d9-42b1-8a48-59eecef80073 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1056.660055] env[68964]: DEBUG nova.network.neutron [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Updated VIF entry in instance network info cache for port ae7fcece-49d9-42b1-8a48-59eecef80073. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1056.662852] env[68964]: DEBUG nova.network.neutron [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Updating instance_info_cache with network_info: [{"id": "ae7fcece-49d9-42b1-8a48-59eecef80073", "address": "fa:16:3e:29:52:62", "network": {"id": "97efec80-6369-49ce-b822-8b7d9c4411ac", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1595213020-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "b0e0fedf90f941bea51c9cdc6dfd97be", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "02bbcead-d833-4543-bec6-fb82dfe659ff", "external-id": "nsx-vlan-transportzone-478", "segmentation_id": 478, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapae7fcece-49", "ovs_interfaceid": "ae7fcece-49d9-42b1-8a48-59eecef80073", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1056.675650] env[68964]: DEBUG oslo_concurrency.lockutils [req-e37d8172-652b-42b4-b5a1-00af0e21f255 req-8367c3a0-db0f-4aa5-9ce9-20dc39b45bb7 service nova] Releasing lock "refresh_cache-2d0469ba-ad42-4b06-ade2-cd64487278c5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1059.998346] env[68964]: WARNING oslo_vmware.rw_handles [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Error occurred while reading the HTTP 
response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1059.998346] env[68964]: ERROR oslo_vmware.rw_handles
[ 1059.999040] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1060.004021] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1060.004021] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Copying Virtual Disk [datastore1] vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/adec52ee-1e89-47ef-b518-259610daf9e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1060.004021] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aa3b8b99-60f1-4f46-80b8-fe4203baa8cc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1060.009870] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for the task: (returnval){
[ 1060.009870] env[68964]: value = "task-3431621"
[ 1060.009870] env[68964]: _type = "Task"
[ 1060.009870] env[68964]: } to complete.
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1060.018394] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Task: {'id': task-3431621, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1060.520948] env[68964]: DEBUG oslo_vmware.exceptions [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1060.521264] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1060.521822] env[68964]: ERROR nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1060.521822] env[68964]: Faults: ['InvalidArgument'] [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Traceback (most recent call last): [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] yield resources [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self.driver.spawn(context, instance, image_meta, [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self._fetch_image_if_missing(context, vi) [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] image_cache(vi, tmp_image_ds_loc) [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] vm_util.copy_virtual_disk( [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] session._wait_for_task(vmdk_copy_task) [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] return self.wait_for_task(task_ref) [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] return evt.wait() [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] result = hub.switch() [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] return self.greenlet.switch() [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self.f(*self.args, **self.kw) [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] raise exceptions.translate_fault(task_info.error) [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Faults: ['InvalidArgument'] [ 1060.521822] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] [ 1060.522731] env[68964]: INFO nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Terminating instance [ 1060.523818] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1060.527710] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1060.527710] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1060.527710] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquired lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1060.527710] env[68964]: DEBUG nova.network.neutron [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1060.527710] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8663d1f-0ded-469b-8e8f-f9b56a068617 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.535074] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1060.535262] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1060.535965] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49600f41-b262-44d4-80b9-cdb3fba0c6ea {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.543953] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Waiting for the task: (returnval){ [ 1060.543953] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52689f4a-8d2e-ec1c-c18f-9ff77f7d4025" [ 1060.543953] env[68964]: _type = "Task" [ 1060.543953] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1060.555027] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52689f4a-8d2e-ec1c-c18f-9ff77f7d4025, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1060.558407] env[68964]: DEBUG nova.network.neutron [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1060.630530] env[68964]: DEBUG nova.network.neutron [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1060.639861] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Releasing lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1060.640287] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1060.640477] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1060.641743] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311fe550-4810-4c10-b33c-696f66a35ca5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.654562] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1060.654811] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9aa08405-35c2-4df7-b2cb-063a2580d55a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.680097] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1060.680440] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1060.680674] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Deleting the datastore file [datastore1] f37297e4-80b0-4c1d-b427-6a6a235bdc57 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1060.680929] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-81c8eba8-19aa-463d-9673-641b07ae89e5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.687122] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for the task: (returnval){ [ 1060.687122] env[68964]: value = "task-3431623" [ 1060.687122] env[68964]: _type = "Task" [ 1060.687122] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1060.699676] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Task: {'id': task-3431623, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1061.055037] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1061.055037] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Creating directory with path [datastore1] vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1061.055037] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7b71e5c0-eeee-4e8e-8280-5a81f4672652 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.069021] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Created directory with path [datastore1] vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1061.069021] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Fetch image to [datastore1] vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1061.069021] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1061.069021] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfffabf4-7864-4b78-8dfe-272883427e35 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.074754] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8120e407-7d88-46bb-baa1-9b7a32a90e2c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.085111] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e2d151f-da4c-476a-904d-05ef64558aeb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.117801] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e65018-48cd-4aa6-bdd3-67f7cbdebaab {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.123105] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a1296f3b-5992-4eed-98a9-8b23d88fe0dc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.142873] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1061.195901] env[68964]: DEBUG oslo_vmware.api [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Task: {'id': task-3431623, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044249} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1061.197057] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1061.198361] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1061.198552] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1061.198725] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1061.198898] env[68964]: INFO nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Took 0.56 seconds to destroy the instance on the hypervisor. 
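Every vCenter task in this log (CreateVM_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) follows the same rhythm: wait_for_task logs the task's returnval, _poll_task logs progress until a terminal state, and the completion record carries duration_secs. A simplified stand-in for that loop, assuming a hypothetical read_task_info accessor in place of oslo.vmware's property-collector machinery:

    import time

    def wait_for_task(session, task_ref, interval=0.5):
        # Re-read the Task managed object until it leaves 'queued'/'running';
        # oslo.vmware does this on a looping call and translates task errors
        # into typed faults (e.g. VimFaultException) rather than RuntimeError.
        while True:
            info = session.read_task_info(task_ref)  # hypothetical accessor
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                raise RuntimeError(info.error)
            time.sleep(interval)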
[ 1061.199161] env[68964]: DEBUG oslo.service.loopingcall [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1061.254417] env[68964]: DEBUG nova.compute.manager [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Skipping network deallocation for instance since networking was not requested. {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1061.257067] env[68964]: DEBUG nova.compute.claims [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1061.257247] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.257474] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.261868] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1061.262043] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1061.661241] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ed39075-8502-4308-99d1-6340d2da3b4d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.668878] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8108ee9f-0eb2-494c-8ea1-c9f128ac3292 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.700049] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81525400-10ae-42f3-b625-f32cffa3b594 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.708336] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-009e52ba-b2f4-4e1f-88d7-ca83ca8f7cb2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.722033] env[68964]: DEBUG nova.compute.provider_tree [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1061.730132] env[68964]: DEBUG nova.scheduler.client.report [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1061.746416] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.489s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1061.747016] env[68964]: ERROR nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1061.747016] env[68964]: Faults: ['InvalidArgument'] [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Traceback (most recent call last): [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: 
f37297e4-80b0-4c1d-b427-6a6a235bdc57] self.driver.spawn(context, instance, image_meta, [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self._fetch_image_if_missing(context, vi) [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] image_cache(vi, tmp_image_ds_loc) [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] vm_util.copy_virtual_disk( [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] session._wait_for_task(vmdk_copy_task) [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] return self.wait_for_task(task_ref) [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] return evt.wait() [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] result = hub.switch() [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] return self.greenlet.switch() [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] self.f(*self.args, **self.kw) [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] raise exceptions.translate_fault(task_info.error) [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Faults: ['InvalidArgument'] [ 1061.747016] env[68964]: ERROR nova.compute.manager [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] [ 1061.747760] env[68964]: DEBUG nova.compute.utils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1061.749416] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Build of instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 was re-scheduled: A specified parameter was not correct: fileType [ 1061.749416] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1061.749781] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1061.750008] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1061.750203] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquired lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1061.750648] env[68964]: DEBUG nova.network.neutron [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1061.778078] env[68964]: DEBUG nova.network.neutron [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1061.885171] env[68964]: DEBUG nova.network.neutron [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1061.894970] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Releasing lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1061.895304] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1061.895559] env[68964]: DEBUG nova.compute.manager [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Skipping network deallocation for instance since networking was not requested. {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1061.988220] env[68964]: INFO nova.scheduler.client.report [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Deleted allocations for instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 [ 1062.008483] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0be92b38-0266-4211-8727-467bc6efe26e tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 354.029s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.009645] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 151.701s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.009829] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1062.010042] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.010214] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.012204] env[68964]: INFO nova.compute.manager [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Terminating instance [ 1062.014153] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquiring lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1062.014340] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Acquired lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.014498] env[68964]: DEBUG nova.network.neutron [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1062.025020] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1062.057028] env[68964]: DEBUG nova.network.neutron [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1062.086141] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1062.086410] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.087885] env[68964]: INFO nova.compute.claims [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1062.346964] env[68964]: DEBUG nova.network.neutron [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1062.360970] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Releasing lock "refresh_cache-f37297e4-80b0-4c1d-b427-6a6a235bdc57" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1062.362074] env[68964]: DEBUG nova.compute.manager [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1062.362074] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1062.362540] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3f5bd082-7436-4e43-b0d6-19cfe97742d1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.376877] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34bf6379-033b-467e-a4e8-a92f1e63fc18 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.409288] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f37297e4-80b0-4c1d-b427-6a6a235bdc57 could not be found. [ 1062.409508] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1062.409690] env[68964]: INFO nova.compute.manager [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1062.409937] env[68964]: DEBUG oslo.service.loopingcall [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1062.412588] env[68964]: DEBUG nova.compute.manager [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1062.412697] env[68964]: DEBUG nova.network.neutron [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1062.432321] env[68964]: DEBUG nova.network.neutron [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1062.441863] env[68964]: DEBUG nova.network.neutron [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1062.452677] env[68964]: INFO nova.compute.manager [-] [instance: f37297e4-80b0-4c1d-b427-6a6a235bdc57] Took 0.04 seconds to deallocate network for instance.
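Annotation: both teardown passes above hand network deallocation to oslo.service's looping-call machinery (the "Waiting for function ... _deallocate_network_with_retries to return" records come from loopingcall.py:435), so a transient Neutron failure is retried rather than aborting the teardown. A minimal sketch of that retry pattern, assuming oslo.service's RetryDecorator and a made-up exception type, not Nova's actual code:

```python
# Minimal sketch of a retry wrapper in the spirit of
# "_deallocate_network_with_retries"; illustrative only.
from oslo_service import loopingcall

class TransientNeutronError(Exception):
    """Made-up stand-in for failures worth retrying (e.g. a Neutron timeout)."""

# Retry up to 3 times, backing off between attempts (2s, 4s, ... capped at
# 10s); any exception not listed in `exceptions` propagates immediately.
@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                            max_sleep_time=10,
                            exceptions=(TransientNeutronError,))
def deallocate_network_with_retries(instance):
    # Unbind/delete the instance's ports here; raise TransientNeutronError
    # on a retryable failure so the decorator schedules another attempt.
    ...
```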
[ 1062.548228] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f158eab-09ff-4571-8f14-f1d50aeac0cb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.557115] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c7df253-eff0-4aee-ba6e-2c8504daf9c1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.594586] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7ab06fc-5c45-4bfb-a0e5-0e40ce81af62 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.597436] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5649d479-4990-4620-957c-77fdbdf32fd2 tempest-ServerShowV247Test-584482267 tempest-ServerShowV247Test-584482267-project-member] Lock "f37297e4-80b0-4c1d-b427-6a6a235bdc57" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.588s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.603859] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3be7300-cd15-4a59-b79d-b7e7b04e0563 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.620438] env[68964]: DEBUG nova.compute.provider_tree [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1062.628743] env[68964]: DEBUG nova.scheduler.client.report [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1062.646146] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.560s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.646620] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Start building networks asynchronously for instance.
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1062.688049] env[68964]: DEBUG nova.compute.utils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1062.690275] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1062.690495] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1062.705417] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1062.788612] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1062.808813] env[68964]: DEBUG nova.policy [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e50c1dfee98641178c82e45bb73036c2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fff27285287e45bd9d3431b9528f8ad2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1062.820590] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1062.820826] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1062.820983] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1062.821588] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1062.821749] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1062.821909] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1062.822138] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1062.822295] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1062.822460] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1062.822623] env[68964]: DEBUG nova.virt.hardware [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1062.822801] env[68964]: DEBUG nova.virt.hardware [None
req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1062.823953] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ada9818-743e-42d4-9817-1a87f924c5cb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.832911] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04c98567-7282-4536-b18b-ccbd5d683c20 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.305267] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Successfully created port: 44979cbf-3cd7-4fb1-a61f-4b38a26f6981 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1063.977388] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Successfully created port: 77292e6d-8936-42b5-89f3-c198d5db3f91 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1065.735500] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Successfully updated port: 44979cbf-3cd7-4fb1-a61f-4b38a26f6981 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1065.741643] env[68964]: DEBUG nova.compute.manager [req-ba94b4eb-0c02-42ca-951a-1310e52cee99 req-878f4e35-86fe-4859-a5f3-87106f969513 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Received event network-vif-plugged-44979cbf-3cd7-4fb1-a61f-4b38a26f6981 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1065.741903] env[68964]: DEBUG oslo_concurrency.lockutils [req-ba94b4eb-0c02-42ca-951a-1310e52cee99 req-878f4e35-86fe-4859-a5f3-87106f969513 service nova] Acquiring lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1065.742144] env[68964]: DEBUG oslo_concurrency.lockutils [req-ba94b4eb-0c02-42ca-951a-1310e52cee99 req-878f4e35-86fe-4859-a5f3-87106f969513 service nova] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1065.743572] env[68964]: DEBUG oslo_concurrency.lockutils [req-ba94b4eb-0c02-42ca-951a-1310e52cee99 req-878f4e35-86fe-4859-a5f3-87106f969513 service nova] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1065.743572] env[68964]: DEBUG nova.compute.manager [req-ba94b4eb-0c02-42ca-951a-1310e52cee99 req-878f4e35-86fe-4859-a5f3-87106f969513 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] No waiting events found dispatching network-vif-plugged-44979cbf-3cd7-4fb1-a61f-4b38a26f6981 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1065.743572] env[68964]: WARNING nova.compute.manager [req-ba94b4eb-0c02-42ca-951a-1310e52cee99 req-878f4e35-86fe-4859-a5f3-87106f969513 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Received unexpected event network-vif-plugged-44979cbf-3cd7-4fb1-a61f-4b38a26f6981 for instance with vm_state building and task_state spawning. [ 1067.393887] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Successfully updated port: 77292e6d-8936-42b5-89f3-c198d5db3f91 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1067.411035] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1067.411035] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquired lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1067.411202] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1067.484140] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1068.041520] env[68964]: DEBUG nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Received event network-changed-44979cbf-3cd7-4fb1-a61f-4b38a26f6981 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1068.041721] env[68964]: DEBUG nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Refreshing instance network info cache due to event network-changed-44979cbf-3cd7-4fb1-a61f-4b38a26f6981. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1068.041924] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Acquiring lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1068.131383] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updating instance_info_cache with network_info: [{"id": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "address": "fa:16:3e:30:06:c1", "network": {"id": "d8b29824-fa3a-485f-8693-cf9540b99b85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-774059152", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44979cbf-3c", "ovs_interfaceid": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77292e6d-8936-42b5-89f3-c198d5db3f91", "address": "fa:16:3e:19:1c:03", "network": {"id": "6c97522b-0c13-42da-9229-2b361e1d99fa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1452957798", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d70692eb-97b3-417c-a4ca-1ee888246ad9", "external-id": "nsx-vlan-transportzone-342", "segmentation_id": 342, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77292e6d-89", "ovs_interfaceid": "77292e6d-8936-42b5-89f3-c198d5db3f91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1068.159028] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Releasing lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1068.159028] env[68964]: DEBUG 
nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance network_info: |[{"id": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "address": "fa:16:3e:30:06:c1", "network": {"id": "d8b29824-fa3a-485f-8693-cf9540b99b85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-774059152", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44979cbf-3c", "ovs_interfaceid": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77292e6d-8936-42b5-89f3-c198d5db3f91", "address": "fa:16:3e:19:1c:03", "network": {"id": "6c97522b-0c13-42da-9229-2b361e1d99fa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1452957798", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d70692eb-97b3-417c-a4ca-1ee888246ad9", "external-id": "nsx-vlan-transportzone-342", "segmentation_id": 342, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77292e6d-89", "ovs_interfaceid": "77292e6d-8936-42b5-89f3-c198d5db3f91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1068.159028] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Acquired lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1068.159028] env[68964]: DEBUG nova.network.neutron [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Refreshing network info cache for port 44979cbf-3cd7-4fb1-a61f-4b38a26f6981 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1068.160223] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] 
[instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:30:06:c1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da0e5087-d65b-416f-90fe-beaa9c534ad3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '44979cbf-3cd7-4fb1-a61f-4b38a26f6981', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:1c:03', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd70692eb-97b3-417c-a4ca-1ee888246ad9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '77292e6d-8936-42b5-89f3-c198d5db3f91', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1068.173326] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Creating folder: Project (fff27285287e45bd9d3431b9528f8ad2). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1068.178971] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4d50c22e-97a5-45b4-8fda-4d505a9da224 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.192626] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Created folder: Project (fff27285287e45bd9d3431b9528f8ad2) in parent group-v684465. [ 1068.192626] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Creating folder: Instances. Parent ref: group-v684540. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1068.192626] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a9f03c64-8d8f-4b75-80b5-e0392c589ba1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.202718] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Created folder: Instances in parent group-v684540. [ 1068.203026] env[68964]: DEBUG oslo.service.loopingcall [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1068.203263] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1068.203501] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-48f27154-9ffe-46a8-b04d-ef5009ab5a34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.228390] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1068.228390] env[68964]: value = "task-3431626" [ 1068.228390] env[68964]: _type = "Task" [ 1068.228390] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1068.239142] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431626, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1068.739464] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431626, 'name': CreateVM_Task, 'duration_secs': 0.304784} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1068.739776] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1068.744106] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1068.744290] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1068.744602] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1068.744869] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-352a076e-bc81-4cd0-a18e-51d3fc1f2d7b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1068.749550] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Waiting for the task: (returnval){ [ 1068.749550] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52cd3c96-7c24-38a6-e6db-2281b8c96580" [ 1068.749550] env[68964]: _type = "Task" [ 1068.749550] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1068.757559] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52cd3c96-7c24-38a6-e6db-2281b8c96580, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1068.778123] env[68964]: DEBUG nova.network.neutron [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updated VIF entry in instance network info cache for port 44979cbf-3cd7-4fb1-a61f-4b38a26f6981. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1068.778123] env[68964]: DEBUG nova.network.neutron [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updating instance_info_cache with network_info: [{"id": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "address": "fa:16:3e:30:06:c1", "network": {"id": "d8b29824-fa3a-485f-8693-cf9540b99b85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-774059152", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44979cbf-3c", "ovs_interfaceid": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77292e6d-8936-42b5-89f3-c198d5db3f91", "address": "fa:16:3e:19:1c:03", "network": {"id": "6c97522b-0c13-42da-9229-2b361e1d99fa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1452957798", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d70692eb-97b3-417c-a4ca-1ee888246ad9", "external-id": "nsx-vlan-transportzone-342", "segmentation_id": 342, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77292e6d-89", "ovs_interfaceid": "77292e6d-8936-42b5-89f3-c198d5db3f91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1068.797591] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Releasing lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1068.799038] env[68964]: DEBUG nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Received event network-vif-plugged-77292e6d-8936-42b5-89f3-c198d5db3f91 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1068.799038] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Acquiring lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1068.799038] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1068.799038] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1068.799038] env[68964]: DEBUG nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] No waiting events found dispatching network-vif-plugged-77292e6d-8936-42b5-89f3-c198d5db3f91 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1068.799404] env[68964]: WARNING nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Received unexpected event network-vif-plugged-77292e6d-8936-42b5-89f3-c198d5db3f91 for instance with vm_state building and task_state spawning. [ 1068.799404] env[68964]: DEBUG nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Received event network-changed-77292e6d-8936-42b5-89f3-c198d5db3f91 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1068.799508] env[68964]: DEBUG nova.compute.manager [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Refreshing instance network info cache due to event network-changed-77292e6d-8936-42b5-89f3-c198d5db3f91. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1068.799625] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Acquiring lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1068.799758] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Acquired lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1068.799905] env[68964]: DEBUG nova.network.neutron [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Refreshing network info cache for port 77292e6d-8936-42b5-89f3-c198d5db3f91 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1069.157853] env[68964]: DEBUG nova.network.neutron [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updated VIF entry in instance network info cache for port 77292e6d-8936-42b5-89f3-c198d5db3f91. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1069.158337] env[68964]: DEBUG nova.network.neutron [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updating instance_info_cache with network_info: [{"id": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "address": "fa:16:3e:30:06:c1", "network": {"id": "d8b29824-fa3a-485f-8693-cf9540b99b85", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-774059152", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.231", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap44979cbf-3c", "ovs_interfaceid": "44979cbf-3cd7-4fb1-a61f-4b38a26f6981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "77292e6d-8936-42b5-89f3-c198d5db3f91", "address": "fa:16:3e:19:1c:03", "network": {"id": "6c97522b-0c13-42da-9229-2b361e1d99fa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1452957798", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.53", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "fff27285287e45bd9d3431b9528f8ad2", "mtu": 8950, 
"physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d70692eb-97b3-417c-a4ca-1ee888246ad9", "external-id": "nsx-vlan-transportzone-342", "segmentation_id": 342, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77292e6d-89", "ovs_interfaceid": "77292e6d-8936-42b5-89f3-c198d5db3f91", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1069.173586] env[68964]: DEBUG oslo_concurrency.lockutils [req-7f59ab58-0a71-40ac-86f2-556535e3ae70 req-a542d7c2-77d2-4e02-bba4-eee33fbe4903 service nova] Releasing lock "refresh_cache-238794bb-9995-4bb0-954d-7ca0ef825e19" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1069.262552] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1069.262812] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1069.263032] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1069.743352] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1070.668453] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "96c1b70b-9a17-46b1-999d-558b85c77d22" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1070.668453] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.784304] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "a317d842-0282-4ace-a457-d8031cf0adca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1072.784304] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "a317d842-0282-4ace-a457-d8031cf0adca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1073.554766] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1073.554990] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1082.724584] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1082.756794] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1082.756998] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.725019] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.725325] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.725455] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.725677] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1084.725746] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1085.335939] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "238794bb-9995-4bb0-954d-7ca0ef825e19" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1086.152099] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "704ec14b-410e-4175-b032-69074b332d87" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1086.152526] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "704ec14b-410e-4175-b032-69074b332d87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1086.723897] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1086.724931] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1086.724931] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1086.746440] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.746614] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.746748] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747168] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747168] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747168] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747301] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747364] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747461] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747580] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1086.747700] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1086.748228] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1086.760047] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1086.760277] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1086.760442] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1086.760622] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1086.761737] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8efc1510-c154-4ea3-9d2f-197dc5a569f0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1086.770554] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a58986e4-d2c8-479e-8da5-6e6179884893 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1086.784430] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c661ad26-feae-4e33-80a6-b2aaef19e143 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1086.791506] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52812bcb-165f-49e1-947b-baa50388be1d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1086.822740] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180946MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1086.822740] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1086.822876] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1086.894614] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.894807] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.894941] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895078] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895203] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895323] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895440] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895585] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895706] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.895818] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1086.906821] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.918575] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 120c1330-9cdf-4db2-8c9f-1fa08dcad359 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.928926] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.941192] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.952656] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 66318915-69a7-4f3a-8aa2-377948732cc5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.962787] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 5c076ffe-9532-4d57-b044-a74a48cb147d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.972934] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 41317213-a0f2-42fc-9e44-dfe83d27a811 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.982154] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 864ec33b-2840-4ed3-b0b6-2ef062141705 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1086.991409] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.001623] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7853bb3-fa53-4911-818f-e03245ad3a0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.011048] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 19f90c65-2865-4fa7-b647-f69fd217e1e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.020607] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c1efc344-848b-4a98-a20a-57ebdfb5ac8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.029774] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.042762] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07d247db-b7ca-4b5f-818f-17411296d08f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.052731] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.063026] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a317d842-0282-4ace-a457-d8031cf0adca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.071900] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.082700] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1087.082700] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1087.082700] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1087.470182] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e41abd7c-d9dd-4992-8b36-dc3291cb5ff6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1087.477967] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6eb7c64-b4b5-4d3b-be98-71868825049b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1087.511544] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbb6319b-33c9-4301-b7ec-4320b6e47b3d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1087.520237] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eab290af-f604-4b19-aa6e-af616bf90d64 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1087.537611] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1087.547311] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1087.562451] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1087.562774] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.740s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1088.539741] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1098.825766] env[68964]: WARNING oslo_vmware.rw_handles [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1098.825766] env[68964]: ERROR oslo_vmware.rw_handles [ 1098.826351] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1098.828900] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1098.828900] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Copying Virtual Disk [datastore2] vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/8e98c2d5-b4f8-48f5-8d0a-16ba83644c38/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1098.828900] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e5de9e69-a794-4486-98be-277aff7261d7 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1098.837125] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Waiting for the task: (returnval){ [ 1098.837125] env[68964]: value = "task-3431627" [ 1098.837125] env[68964]: _type = "Task" [ 1098.837125] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1098.845449] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Task: {'id': task-3431627, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1099.347768] env[68964]: DEBUG oslo_vmware.exceptions [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1099.348142] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1099.348756] env[68964]: ERROR nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1099.348756] env[68964]: Faults: ['InvalidArgument'] [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Traceback (most recent call last): [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] yield resources [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self.driver.spawn(context, instance, image_meta, [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1099.348756] env[68964]: 
ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self._fetch_image_if_missing(context, vi) [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] image_cache(vi, tmp_image_ds_loc) [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] vm_util.copy_virtual_disk( [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] session._wait_for_task(vmdk_copy_task) [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] return self.wait_for_task(task_ref) [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] return evt.wait() [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] result = hub.switch() [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] return self.greenlet.switch() [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self.f(*self.args, **self.kw) [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] raise exceptions.translate_fault(task_info.error) [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Faults: ['InvalidArgument'] [ 1099.348756] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] [ 1099.349693] 
env[68964]: INFO nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Terminating instance [ 1099.350829] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1099.351073] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1099.351746] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1099.351943] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1099.352206] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-eb447f82-d0fe-417f-ae3b-49f73e07d336 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.355370] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947d98b1-7764-416d-85d3-3af997226cc7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.362820] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1099.363084] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ca05e5c1-7692-41f4-8bad-40c3d7068ebb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.365528] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1099.365702] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 
tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1099.366846] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4b8bfe36-64ac-4c55-8b57-abfe03ef74ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.376100] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Waiting for the task: (returnval){ [ 1099.376100] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522b7b0d-86b1-7c28-01d6-7567d602d347" [ 1099.376100] env[68964]: _type = "Task" [ 1099.376100] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1099.386305] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522b7b0d-86b1-7c28-01d6-7567d602d347, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1099.886692] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1099.886966] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Creating directory with path [datastore2] vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1099.887220] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d5faf443-d3b9-435a-933a-f3340806dc38 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.910208] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Created directory with path [datastore2] vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1099.910425] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Fetch image to [datastore2] vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1099.910425] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1099.911246] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b79d25c-f3c4-4806-9a51-f4093f04f0af {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.919537] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad10fdb-f12a-4deb-aaff-0069130583c8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.929320] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7044b1d8-9935-4fe7-8ac5-b3627111d1a2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.983150] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdd6f033-9200-485f-b065-2d0ab0d908a8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1099.991930] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a5ea3e34-d403-4366-8a30-3a28ae66492b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1100.023650] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1100.082230] env[68964]: DEBUG oslo_vmware.rw_handles [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1100.145189] env[68964]: DEBUG oslo_vmware.rw_handles [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Completed reading data from the image iterator. 
{{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1100.145400] env[68964]: DEBUG oslo_vmware.rw_handles [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1100.331122] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1100.331366] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1100.331540] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Deleting the datastore file [datastore2] 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1100.331813] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7a7a6258-7d48-4b17-b712-bde6ea41fe68 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1100.337925] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Waiting for the task: (returnval){
[ 1100.337925] env[68964]: value = "task-3431629"
[ 1100.337925] env[68964]: _type = "Task"
[ 1100.337925] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1100.346434] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Task: {'id': task-3431629, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1100.848452] env[68964]: DEBUG oslo_vmware.api [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Task: {'id': task-3431629, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087203} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
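The task-3431629 exchange above (invoke DeleteDatastoreFile_Task, poll at "progress is 0%", then "completed successfully") is oslo.vmware's generic task-polling pattern: wait_for_task() keeps re-reading the TaskInfo until the task leaves its transient states, and a terminal 'error' state is translated into a raised fault. A minimal runnable sketch of that shape, with canned stand-ins where the real code reads the vSphere PropertyCollector:

    import itertools
    import time
    from types import SimpleNamespace

    # Canned TaskInfo sequence standing in for PropertyCollector reads:
    # two 'running' polls, then a terminal 'success'.
    _states = itertools.chain(
        [SimpleNamespace(state='running', progress=0, result=None, error=None)] * 2,
        itertools.repeat(
            SimpleNamespace(state='success', progress=100, result='ok', error=None)))

    def fetch_task_info(task_ref):
        # Illustrative stub, not the oslo.vmware API.
        return next(_states)

    class TaskFaultError(Exception):
        """Terminal 'error' state; oslo.vmware raises translate_fault() here."""

    def wait_for_task(task_ref, interval=0.01):
        # Mirrors the "progress is 0%" / "completed successfully" log lines.
        while True:
            info = fetch_task_info(task_ref)
            if info.state in ('queued', 'running'):
                print(f"Task: {task_ref} progress is {info.progress}%.")
                time.sleep(interval)
            elif info.state == 'success':
                return info.result
            else:
                raise TaskFaultError(info.error)

    print(wait_for_task('task-3431629'))

In the real driver this loop runs inside an eventlet looping call, which is why the failure later in this log surfaces through loopingcall.py and hub.switch().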
[ 1100.848717] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1100.848901] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1100.849084] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1100.849265] env[68964]: INFO nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Took 1.50 seconds to destroy the instance on the hypervisor.
[ 1100.851628] env[68964]: DEBUG nova.compute.claims [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1100.851907] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1100.852242] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1101.202733] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b94e16b0-900b-477e-aaed-e34c820a7d3e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1101.209968] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03b65e95-f205-43c1-b651-3527e6a90b2e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
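The Acquiring/acquired/released triplets around "compute_resources" come from oslo.concurrency: the resource tracker wraps every inventory mutation in one named lock, and the 'inner' wrapper in lockutils.py emits exactly these lines. A small sketch of that pattern (assumes oslo.concurrency is installed; the class is a toy, not Nova's ResourceTracker):

    from oslo_concurrency import lockutils

    class ResourceTracker:
        """Serialize all claim mutations behind one named lock."""

        def __init__(self):
            self.claims = {}  # instance uuid -> (vcpus, memory_mb)

        @lockutils.synchronized('compute_resources')
        def instance_claim(self, instance_uuid, vcpus, memory_mb):
            self.claims[instance_uuid] = (vcpus, memory_mb)

        @lockutils.synchronized('compute_resources')
        def abort_instance_claim(self, instance_uuid):
            # Dropping the claim frees the capacity for the next build
            # attempt; the re-schedule seen below depends on this.
            self.claims.pop(instance_uuid, None)

    tracker = ResourceTracker()
    tracker.instance_claim('5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a', 1, 128)
    tracker.abort_instance_claim('5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a')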
[ 1101.240052] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a22636d2-33ff-4467-802b-193064ed7178 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1101.246903] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db191986-d68b-4648-94dc-5ed2492f7526 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1101.260600] env[68964]: DEBUG nova.compute.provider_tree [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1101.268534] env[68964]: DEBUG nova.scheduler.client.report [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1101.282445] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.430s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1101.283205] env[68964]: ERROR nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1101.283205] env[68964]: Faults: ['InvalidArgument']
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Traceback (most recent call last):
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self.driver.spawn(context, instance, image_meta,
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self._fetch_image_if_missing(context, vi)
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] image_cache(vi, tmp_image_ds_loc)
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] vm_util.copy_virtual_disk(
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] session._wait_for_task(vmdk_copy_task)
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] return self.wait_for_task(task_ref)
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] return evt.wait()
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] result = hub.switch()
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] return self.greenlet.switch()
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] self.f(*self.args, **self.kw)
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] raise exceptions.translate_fault(task_info.error)
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Faults: ['InvalidArgument']
[ 1101.283205] env[68964]: ERROR nova.compute.manager [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a]
[ 1101.284149] env[68964]: DEBUG nova.compute.utils [None
req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1101.285289] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Build of instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a was re-scheduled: A specified parameter was not correct: fileType [ 1101.285289] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1101.285711] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1101.285926] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1101.286092] env[68964]: DEBUG nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1101.286258] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1101.656254] env[68964]: DEBUG nova.network.neutron [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1101.667312] env[68964]: INFO nova.compute.manager [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Took 0.38 seconds to deallocate network for instance. 
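The traceback above is the pivotal event of this section: CopyVirtualDisk_Task rejects the fileType parameter while caching the sparse image, spawn() fails with VimFaultException, and _do_build_and_run_instance hands the build back to the scheduler after aborting the claim and deallocating networking. A compressed, runnable sketch of that failure path (simplified stand-ins, not ComputeManager's actual code):

    class RescheduledException(Exception):
        """Signals that the request should go back through the scheduler."""

    def spawn(instance):
        # Stands in for the driver spawn that failed above.
        raise RuntimeError("A specified parameter was not correct: fileType")

    def abort_claim(instance):
        print(f"aborting resource claim for {instance}")

    def deallocate_network(instance):
        print(f"deallocating network for {instance}")

    def build_and_run(instance):
        try:
            spawn(instance)
        except Exception as exc:
            # Failure path seen in the log: free the claim, tear down
            # networking, then hand the request back to the scheduler.
            abort_claim(instance)
            deallocate_network(instance)
            raise RescheduledException(str(exc)) from exc

    try:
        build_and_run('5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a')
    except RescheduledException as exc:
        print(f"re-scheduled: {exc}")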
[ 1101.773612] env[68964]: INFO nova.scheduler.client.report [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Deleted allocations for instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a [ 1101.803257] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fd7d7d3-ae37-4a87-b682-1a79fbdc078d tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 431.686s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.804546] env[68964]: DEBUG oslo_concurrency.lockutils [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 234.047s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.804825] env[68964]: DEBUG oslo_concurrency.lockutils [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Acquiring lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1101.805107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1101.805331] env[68964]: DEBUG oslo_concurrency.lockutils [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.807388] env[68964]: INFO nova.compute.manager [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Terminating instance [ 1101.808932] env[68964]: DEBUG nova.compute.manager [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1101.809198] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1101.809700] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0cfd726e-cd78-4602-8602-c7c258945b94 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.819677] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b562e61e-2016-4fe2-8cf3-8b7f0c316f88 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1101.831188] env[68964]: DEBUG nova.compute.manager [None req-b1d4e238-f8c8-4a77-94c3-50f8bd6cb5c7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8b11bda8-2923-4641-869b-39e4fce369b4] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1101.854228] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a could not be found. [ 1101.854228] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1101.854228] env[68964]: INFO nova.compute.manager [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1101.854228] env[68964]: DEBUG oslo.service.loopingcall [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1101.854228] env[68964]: DEBUG nova.compute.manager [-] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1101.854228] env[68964]: DEBUG nova.network.neutron [-] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1101.860121] env[68964]: DEBUG nova.compute.manager [None req-b1d4e238-f8c8-4a77-94c3-50f8bd6cb5c7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8b11bda8-2923-4641-869b-39e4fce369b4] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1101.884896] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b1d4e238-f8c8-4a77-94c3-50f8bd6cb5c7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8b11bda8-2923-4641-869b-39e4fce369b4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.433s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.889226] env[68964]: DEBUG nova.network.neutron [-] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1101.896676] env[68964]: DEBUG nova.compute.manager [None req-5e66c901-f711-434a-ad0f-8f298c733592 tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] [instance: d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1101.900259] env[68964]: INFO nova.compute.manager [-] [instance: 5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a] Took 0.05 seconds to deallocate network for instance. [ 1101.919431] env[68964]: DEBUG nova.compute.manager [None req-5e66c901-f711-434a-ad0f-8f298c733592 tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] [instance: d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1101.940601] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5e66c901-f711-434a-ad0f-8f298c733592 tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] Lock "d7af5b42-41ea-4cac-b9ae-66ec4b6d25f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.978s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1101.952231] env[68964]: DEBUG nova.compute.manager [None req-ba3845ee-bf0e-480e-b0ed-bff5d17c6ecf tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] [instance: 120c1330-9cdf-4db2-8c9f-1fa08dcad359] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1101.976537] env[68964]: DEBUG nova.compute.manager [None req-ba3845ee-bf0e-480e-b0ed-bff5d17c6ecf tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] [instance: 120c1330-9cdf-4db2-8c9f-1fa08dcad359] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1102.005201] env[68964]: DEBUG oslo_concurrency.lockutils [None req-bdcbd6c7-5245-424b-a860-d3051a0fd83a tempest-ImagesOneServerNegativeTestJSON-1641455724 tempest-ImagesOneServerNegativeTestJSON-1641455724-project-member] Lock "5c7f0f7a-d4dc-43e0-b08e-9ab883ef7a8a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.201s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1102.015261] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ba3845ee-bf0e-480e-b0ed-bff5d17c6ecf tempest-ServerRescueNegativeTestJSON-1616984841 tempest-ServerRescueNegativeTestJSON-1616984841-project-member] Lock "120c1330-9cdf-4db2-8c9f-1fa08dcad359" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.372s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1102.026101] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1102.074553] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1102.074809] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1102.076308] env[68964]: INFO nova.compute.claims [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1102.412983] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beeedc11-fae3-4dfc-8a65-e666f26fe7d5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.420669] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37257889-ee9d-4064-81fb-2408a3143801 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.452068] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-979c3d3a-1c2b-4886-b64f-2dd616f4f2d1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.458928] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe02ee3f-7525-43db-b194-5d2191bbd992 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.472218] env[68964]: DEBUG nova.compute.provider_tree [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1102.481521] env[68964]: DEBUG nova.scheduler.client.report [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1102.497140] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.421s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1102.497140] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1102.537057] env[68964]: DEBUG nova.compute.utils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1102.538449] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Allocating IP information in the background. 
{{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1102.538618] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1102.549991] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1102.607911] env[68964]: DEBUG nova.policy [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c60403dfb994698a29af773800a034d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '598d20cfa31f46f287517bc0e3e3cf2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1102.617037] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1102.644939] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1102.645215] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1102.645376] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1102.645554] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1102.645697] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1102.645839] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1102.646141] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1102.646329] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1102.646498] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 
tempest-ImagesTestJSON-1338147904-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1102.646657] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1102.646827] env[68964]: DEBUG nova.virt.hardware [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1102.647703] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-120eddd7-4c42-4720-896b-f2917c1e2092 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.655679] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24ed08b0-8d62-41b0-848e-1a9d6d4f4d51 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1102.937033] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Successfully created port: f80bcc30-1626-4709-bd0c-417881000634 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1103.620592] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "4d272615-e2dd-4540-88d0-4a209f559147" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1103.728248] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Successfully updated port: f80bcc30-1626-4709-bd0c-417881000634 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1103.747532] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1103.747532] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquired lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1103.747532] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Building network 
info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1103.806031] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1103.865725] env[68964]: DEBUG nova.compute.manager [req-2f567d07-5f69-435c-b2b7-3599291d9c81 req-50607c4a-2619-47f5-b041-9db2d8c49e00 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Received event network-vif-plugged-f80bcc30-1626-4709-bd0c-417881000634 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1103.866016] env[68964]: DEBUG oslo_concurrency.lockutils [req-2f567d07-5f69-435c-b2b7-3599291d9c81 req-50607c4a-2619-47f5-b041-9db2d8c49e00 service nova] Acquiring lock "4d272615-e2dd-4540-88d0-4a209f559147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1103.866155] env[68964]: DEBUG oslo_concurrency.lockutils [req-2f567d07-5f69-435c-b2b7-3599291d9c81 req-50607c4a-2619-47f5-b041-9db2d8c49e00 service nova] Lock "4d272615-e2dd-4540-88d0-4a209f559147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1103.866325] env[68964]: DEBUG oslo_concurrency.lockutils [req-2f567d07-5f69-435c-b2b7-3599291d9c81 req-50607c4a-2619-47f5-b041-9db2d8c49e00 service nova] Lock "4d272615-e2dd-4540-88d0-4a209f559147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1103.866491] env[68964]: DEBUG nova.compute.manager [req-2f567d07-5f69-435c-b2b7-3599291d9c81 req-50607c4a-2619-47f5-b041-9db2d8c49e00 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] No waiting events found dispatching network-vif-plugged-f80bcc30-1626-4709-bd0c-417881000634 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1103.866653] env[68964]: WARNING nova.compute.manager [req-2f567d07-5f69-435c-b2b7-3599291d9c81 req-50607c4a-2619-47f5-b041-9db2d8c49e00 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Received unexpected event network-vif-plugged-f80bcc30-1626-4709-bd0c-417881000634 for instance with vm_state building and task_state deleting. 
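The "Received unexpected event" warning above is the tail end of Nova's external-event handshake: normally the build thread registers a waiter for network-vif-plugged before Neutron sends it, and the incoming event pops that waiter; here the instance is already in task_state deleting, so nothing is waiting. An illustrative sketch of such a registry (names simplified; Nova's InstanceEvents is more elaborate):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # cf. the "<uuid>-events" lock above
            self._waiters = {}              # (instance, event) -> threading.Event

        def prepare_for_event(self, instance, event):
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance, event)] = waiter
            return waiter

        def pop_instance_event(self, instance, event):
            with self._lock:
                waiter = self._waiters.pop((instance, event), None)
            if waiter is None:
                # Nothing registered: the "unexpected event" case logged above.
                print(f"unexpected event {event} for {instance}")
                return False
            waiter.set()
            return True

    events = InstanceEvents()
    events.pop_instance_event('4d272615-e2dd-4540-88d0-4a209f559147',
                              'network-vif-plugged')  # prints the warning case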
[ 1104.028973] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Updating instance_info_cache with network_info: [{"id": "f80bcc30-1626-4709-bd0c-417881000634", "address": "fa:16:3e:a9:b6:7e", "network": {"id": "6b837e8b-6389-4429-ad21-709304e920fc", "bridge": "br-int", "label": "tempest-ImagesTestJSON-125611246-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "598d20cfa31f46f287517bc0e3e3cf2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b2ede0e6-8d7a-4018-bb37-25bf388e9867", "external-id": "nsx-vlan-transportzone-945", "segmentation_id": 945, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf80bcc30-16", "ovs_interfaceid": "f80bcc30-1626-4709-bd0c-417881000634", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1104.042593] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Releasing lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1104.042944] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance network_info: |[{"id": "f80bcc30-1626-4709-bd0c-417881000634", "address": "fa:16:3e:a9:b6:7e", "network": {"id": "6b837e8b-6389-4429-ad21-709304e920fc", "bridge": "br-int", "label": "tempest-ImagesTestJSON-125611246-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "598d20cfa31f46f287517bc0e3e3cf2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b2ede0e6-8d7a-4018-bb37-25bf388e9867", "external-id": "nsx-vlan-transportzone-945", "segmentation_id": 945, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf80bcc30-16", "ovs_interfaceid": "f80bcc30-1626-4709-bd0c-417881000634", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1104.043346] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 
tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a9:b6:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b2ede0e6-8d7a-4018-bb37-25bf388e9867', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f80bcc30-1626-4709-bd0c-417881000634', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1104.050899] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Creating folder: Project (598d20cfa31f46f287517bc0e3e3cf2b). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1104.051782] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-34ad52f9-7a9f-4241-a8a8-ad7279bf0673 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1104.062366] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Created folder: Project (598d20cfa31f46f287517bc0e3e3cf2b) in parent group-v684465.
[ 1104.062551] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Creating folder: Instances. Parent ref: group-v684543. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1104.062789] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e317bd67-a808-4fb3-9a02-09197a4636b3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1104.072022] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Created folder: Instances in parent group-v684543.
[ 1104.072022] env[68964]: DEBUG oslo.service.loopingcall [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1104.072022] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1104.072022] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3cc76dd9-9408-4439-b9a3-b6ef3b3f0dc5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1104.090820] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1104.090820] env[68964]: value = "task-3431632"
[ 1104.090820] env[68964]: _type = "Task"
[ 1104.090820] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1104.097941] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431632, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1104.602025] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431632, 'name': CreateVM_Task, 'duration_secs': 0.278852} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1104.602025] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1104.602167] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1104.602211] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1104.602508] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1104.602751] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-80335b4f-1e08-4d59-b129-223dddb51213 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1104.607126] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Waiting for the task: (returnval){
[ 1104.607126] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52444e55-3614-3544-5cac-188e2ddbdc15"
[ 1104.607126] env[68964]: _type = "Task"
[ 1104.607126] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1104.614941] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52444e55-3614-3544-5cac-188e2ddbdc15, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1105.118145] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1105.118474] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1105.118666] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1105.925322] env[68964]: DEBUG nova.compute.manager [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Received event network-changed-f80bcc30-1626-4709-bd0c-417881000634 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1105.925491] env[68964]: DEBUG nova.compute.manager [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Refreshing instance network info cache due to event network-changed-f80bcc30-1626-4709-bd0c-417881000634. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1105.925731] env[68964]: DEBUG oslo_concurrency.lockutils [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] Acquiring lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1105.925899] env[68964]: DEBUG oslo_concurrency.lockutils [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] Acquired lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1105.926112] env[68964]: DEBUG nova.network.neutron [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Refreshing network info cache for port f80bcc30-1626-4709-bd0c-417881000634 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1106.459841] env[68964]: DEBUG nova.network.neutron [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Updated VIF entry in instance network info cache for port f80bcc30-1626-4709-bd0c-417881000634. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1106.460302] env[68964]: DEBUG nova.network.neutron [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Updating instance_info_cache with network_info: [{"id": "f80bcc30-1626-4709-bd0c-417881000634", "address": "fa:16:3e:a9:b6:7e", "network": {"id": "6b837e8b-6389-4429-ad21-709304e920fc", "bridge": "br-int", "label": "tempest-ImagesTestJSON-125611246-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "598d20cfa31f46f287517bc0e3e3cf2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b2ede0e6-8d7a-4018-bb37-25bf388e9867", "external-id": "nsx-vlan-transportzone-945", "segmentation_id": 945, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf80bcc30-16", "ovs_interfaceid": "f80bcc30-1626-4709-bd0c-417881000634", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1106.469782] env[68964]: DEBUG oslo_concurrency.lockutils [req-680e3c96-22b8-4a47-8e74-7e2508aa27f2 req-2ee5f1e0-9946-4071-a8cc-5078ad22a9a4 service nova] Releasing lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1110.017232] env[68964]: WARNING oslo_vmware.rw_handles [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1110.017232] env[68964]: ERROR oslo_vmware.rw_handles
[ 1110.017824] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1110.019737] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1110.019970] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Copying Virtual Disk [datastore1] vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/7d9d2bba-1b55-47b9-a7b7-3145f7fafd46/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1110.020275] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b79a9eaa-9002-41e5-989e-94d653c15787 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1110.029575] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Waiting for the task: (returnval){
[ 1110.029575] env[68964]: value = "task-3431633"
[ 1110.029575] env[68964]: _type = "Task"
[ 1110.029575] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1110.038503] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Task: {'id': task-3431633, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1110.539758] env[68964]: DEBUG oslo_vmware.exceptions [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1110.540047] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1110.540590] env[68964]: ERROR nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1110.540590] env[68964]: Faults: ['InvalidArgument']
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Traceback (most recent call last):
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] yield resources
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self.driver.spawn(context, instance, image_meta,
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self._fetch_image_if_missing(context, vi)
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] image_cache(vi, tmp_image_ds_loc)
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] vm_util.copy_virtual_disk(
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] session._wait_for_task(vmdk_copy_task)
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] return self.wait_for_task(task_ref)
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] return evt.wait()
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] result = hub.switch()
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] return self.greenlet.switch()
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self.f(*self.args, **self.kw)
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] raise exceptions.translate_fault(task_info.error)
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Faults: ['InvalidArgument']
[ 1110.540590] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5]
[ 1110.541599] env[68964]: INFO nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Terminating instance
[ 1110.542415] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1110.542620] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1110.543264] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1110.543452] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1110.543668] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4217a3a0-34cf-4beb-88e5-69f662de1724 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1110.545796] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e1b580d-7ccb-4333-801d-59ecc0e1f8a9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1110.552265] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1110.552467] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6ec5d285-e261-4353-a645-728a2c8ea2ea {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1110.554538] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1110.554711] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1110.555622] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a46c23be-d784-4046-8cc0-93ec302f67c7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1110.560303] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Waiting for the task: (returnval){
[ 1110.560303] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529cd729-b2a4-3079-9fbd-a789f6bc488c"
[ 1110.560303] env[68964]: _type = "Task"
[ 1110.560303] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1110.570098] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529cd729-b2a4-3079-9fbd-a789f6bc488c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1110.616052] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1110.616229] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1110.616409] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Deleting the datastore file [datastore1] ace6f24b-1fd4-4db2-a232-ac80b1f810d5 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1110.616659] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5ebfe6cd-821b-4809-a0da-061e6b5c07da {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1110.622923] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Waiting for the task: (returnval){
[ 1110.622923] env[68964]: value = "task-3431635"
[ 1110.622923] env[68964]: _type = "Task"
[ 1110.622923] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1110.631583] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Task: {'id': task-3431635, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1111.070990] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1111.070990] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Creating directory with path [datastore1] vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1111.071344] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1fdec6fb-829e-4ce7-abd1-f0ad31656a64 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.082383] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Created directory with path [datastore1] vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1111.082575] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Fetch image to [datastore1] vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1111.082741] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1111.083527] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e53bfca2-1c87-48b0-a478-bfa2d66de3a1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.090009] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d80353dc-07bf-4c0f-b984-1ba0d3c88dba {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.099041] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-409484af-83a8-4fe7-9a20-c5d12aa7bc5a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.134256] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc6607e2-549c-4f01-9af6-7f1d1d10bc16 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.141752] env[68964]: DEBUG oslo_vmware.api [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Task: {'id': task-3431635, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065225} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1111.143309] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1111.143498] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1111.143691] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1111.143839] env[68964]: INFO nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1111.145702] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0b6e700e-abc2-426a-9672-5652100b6218 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.147661] env[68964]: DEBUG nova.compute.claims [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1111.147833] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1111.148072] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1111.168714] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1111.223687] env[68964]: DEBUG oslo_vmware.rw_handles [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1111.284102] env[68964]: DEBUG oslo_vmware.rw_handles [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1111.284320] env[68964]: DEBUG oslo_vmware.rw_handles [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1111.534606] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84d0981c-a069-47fe-95bc-680a71d1a27a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.541934] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34077525-ea50-4515-8f00-f315ea788121 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.572186] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-128182ee-9fc0-4017-8874-7cbf9d41a18c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.578998] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce7a9a6e-fe04-4467-85b8-30fe47f785db {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1111.591709] env[68964]: DEBUG nova.compute.provider_tree [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1111.599885] env[68964]: DEBUG nova.scheduler.client.report [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1111.614151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.466s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1111.614671] env[68964]: ERROR nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1111.614671] env[68964]: Faults: ['InvalidArgument']
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Traceback (most recent call last):
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self.driver.spawn(context, instance, image_meta,
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self._fetch_image_if_missing(context, vi)
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] image_cache(vi, tmp_image_ds_loc)
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] vm_util.copy_virtual_disk(
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] session._wait_for_task(vmdk_copy_task)
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] return self.wait_for_task(task_ref)
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] return evt.wait()
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] result = hub.switch()
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] return self.greenlet.switch()
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] self.f(*self.args, **self.kw)
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] raise exceptions.translate_fault(task_info.error)
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Faults: ['InvalidArgument']
[ 1111.614671] env[68964]: ERROR nova.compute.manager [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5]
[ 1111.615420] env[68964]: DEBUG nova.compute.utils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1111.616708] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Build of instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 was re-scheduled: A specified parameter was not correct: fileType
[ 1111.616708] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1111.617088] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1111.617265] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1111.617432] env[68964]: DEBUG nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1111.617592] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1111.906361] env[68964]: DEBUG nova.network.neutron [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1111.921174] env[68964]: INFO nova.compute.manager [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Took 0.30 seconds to deallocate network for instance.
[ 1112.034020] env[68964]: INFO nova.scheduler.client.report [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Deleted allocations for instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5
[ 1112.061576] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c82c3d0a-33bf-4583-b351-d2194a37ee78 tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 392.334s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1112.061576] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 194.230s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1112.061576] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1112.061576] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1112.061576] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1112.066768] env[68964]: INFO nova.compute.manager [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Terminating instance
[ 1112.068927] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquiring lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1112.069117] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Acquired lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1112.069303] env[68964]: DEBUG nova.network.neutron [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1112.089627] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1112.098416] env[68964]: DEBUG nova.network.neutron [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1112.143835] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1112.144521] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1112.147025] env[68964]: INFO nova.compute.claims [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1112.276783] env[68964]: DEBUG nova.network.neutron [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1112.285965] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Releasing lock "refresh_cache-ace6f24b-1fd4-4db2-a232-ac80b1f810d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1112.286400] env[68964]: DEBUG nova.compute.manager [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1112.286597] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1112.287135] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bf32d84c-4fb9-4e5f-a404-a4a216b1e49c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.296923] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e9a566c-28a4-4ddc-bed8-6b73d33f1325 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.327415] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ace6f24b-1fd4-4db2-a232-ac80b1f810d5 could not be found.
[ 1112.327623] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1112.327801] env[68964]: INFO nova.compute.manager [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1112.328051] env[68964]: DEBUG oslo.service.loopingcall [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1112.330487] env[68964]: DEBUG nova.compute.manager [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1112.330595] env[68964]: DEBUG nova.network.neutron [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1112.347592] env[68964]: DEBUG nova.network.neutron [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1112.354843] env[68964]: DEBUG nova.network.neutron [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1112.364316] env[68964]: INFO nova.compute.manager [-] [instance: ace6f24b-1fd4-4db2-a232-ac80b1f810d5] Took 0.03 seconds to deallocate network for instance.
[ 1112.470837] env[68964]: DEBUG oslo_concurrency.lockutils [None req-deae2fe1-897f-4752-bd41-fb8512cd3e6e tempest-ServersV294TestFqdnHostnames-1192146358 tempest-ServersV294TestFqdnHostnames-1192146358-project-member] Lock "ace6f24b-1fd4-4db2-a232-ac80b1f810d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.411s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1112.545549] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-932b18c5-6772-45f7-9bf5-befdb74d1b38 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.553479] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273d8a7b-849c-4da0-8e2a-797234927e2b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.584879] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae6a88d-8ed2-4946-99c7-e0711c57b98e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.591928] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ecd0228-d226-4e26-893f-1ecad9a5b32d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.605097] env[68964]: DEBUG nova.compute.provider_tree [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1112.615380] env[68964]: DEBUG nova.scheduler.client.report [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1112.629927] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.485s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1112.630127] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1112.664128] env[68964]: DEBUG nova.compute.utils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1112.665718] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1112.666080] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1112.674697] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1112.739715] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1112.765432] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1112.765690] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1112.765844] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1112.766029] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1112.766179] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1112.766322] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1112.766525] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1112.766725] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1112.766827] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1112.767161] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1112.767426] env[68964]: DEBUG nova.virt.hardware [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1112.768375] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ea0cfc8-07f7-4560-a0b3-cd460c112022 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1112.772134] env[68964]: DEBUG nova.policy [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '09dcfa1c27c6474ba1138c7700157495', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a2de969062fc4ff09a523f59d0030e2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1112.778860] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad361648-0914-45f5-bd7a-711a7b6592ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1113.389226] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Successfully created port: f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1113.650263] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1114.163780] env[68964]: DEBUG nova.compute.manager [req-27499410-593a-4946-92e6-fd5cebe6555f req-75902dcb-8ec4-4147-9e74-e55c4999408b service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Received event network-vif-plugged-f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1114.164019] env[68964]:
DEBUG oslo_concurrency.lockutils [req-27499410-593a-4946-92e6-fd5cebe6555f req-75902dcb-8ec4-4147-9e74-e55c4999408b service nova] Acquiring lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.164286] env[68964]: DEBUG oslo_concurrency.lockutils [req-27499410-593a-4946-92e6-fd5cebe6555f req-75902dcb-8ec4-4147-9e74-e55c4999408b service nova] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1114.164442] env[68964]: DEBUG oslo_concurrency.lockutils [req-27499410-593a-4946-92e6-fd5cebe6555f req-75902dcb-8ec4-4147-9e74-e55c4999408b service nova] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1114.164605] env[68964]: DEBUG nova.compute.manager [req-27499410-593a-4946-92e6-fd5cebe6555f req-75902dcb-8ec4-4147-9e74-e55c4999408b service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] No waiting events found dispatching network-vif-plugged-f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1114.164763] env[68964]: WARNING nova.compute.manager [req-27499410-593a-4946-92e6-fd5cebe6555f req-75902dcb-8ec4-4147-9e74-e55c4999408b service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Received unexpected event network-vif-plugged-f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d for instance with vm_state building and task_state deleting. 
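The acquire/acquired/released triplets above, such as the per-instance "-events" lock around `pop_instance_event`, are oslo.concurrency's named-lock logging. A minimal sketch of the same pattern, assuming nothing beyond oslo.concurrency itself:

```python
# Minimal sketch of the oslo.concurrency named-lock pattern that emits the
# 'Acquiring lock ... / acquired ... waited / "released" ... held' lines above.
from oslo_concurrency import lockutils

# Context-manager form, as used for the per-instance "-events" lock:
with lockutils.lock('94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events'):
    pass  # pop/dispatch the pending instance event under the lock

# Decorator form, as used for coarser locks such as "compute_resources":
@lockutils.synchronized('compute_resources')
def instance_claim():
    pass  # resource-tracker bookkeeping runs with the lock held
```

The "waited"/"held" durations in the log come from lockutils itself, which is why every lock in this trace reports both numbers.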
[ 1114.281181] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Successfully updated port: f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1114.294929] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1114.295122] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquired lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1114.295352] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1114.360442] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance cache missing network info. 
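The "Successfully created port" / "Successfully updated port" records above come from Nova's Neutron API layer. Purely as an illustration (this calls python-neutronclient directly rather than Nova's `_create_port_minimal`, and `keystone_session` is an assumed pre-authenticated keystoneauth1 session), the corresponding port-create request looks roughly like:

```python
# Illustrative only, not Nova's code: creating a port bound to an instance,
# via python-neutronclient. keystone_session is an assumed auth session.
from neutronclient.v2_0 import client as neutron_client

def create_instance_port(keystone_session, network_id, instance_uuid):
    neutron = neutron_client.Client(session=keystone_session)
    body = {'port': {
        'network_id': network_id,      # e.g. the "shared" network in this log
        'device_id': instance_uuid,    # ties the port to the server
        'device_owner': 'compute:nova',
    }}
    return neutron.create_port(body)['port']
```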
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1114.616750] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Updating instance_info_cache with network_info: [{"id": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "address": "fa:16:3e:a4:e7:ca", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.31", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf9c3ede0-f9", "ovs_interfaceid": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1114.629772] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Releasing lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1114.630104] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance network_info: |[{"id": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "address": "fa:16:3e:a4:e7:ca", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.31", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf9c3ede0-f9", "ovs_interfaceid": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1114.630492] env[68964]: DEBUG 
nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a4:e7:ca', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f9c4edd5-d88e-4996-afea-00130ace0dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1114.638274] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Creating folder: Project (a2de969062fc4ff09a523f59d0030e2e). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1114.638963] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6f956318-6574-4676-8a83-86d822920b49 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.651697] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Created folder: Project (a2de969062fc4ff09a523f59d0030e2e) in parent group-v684465. [ 1114.651895] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Creating folder: Instances. Parent ref: group-v684546. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1114.652137] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52fcb028-f3a1-4bf6-ad04-77b6c5b28701 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.661084] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Created folder: Instances in parent group-v684546. [ 1114.661338] env[68964]: DEBUG oslo.service.loopingcall [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1114.661505] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1114.661723] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a5c1ab2f-6ef7-483b-9afb-a450b6359476 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.681305] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1114.681305] env[68964]: value = "task-3431638" [ 1114.681305] env[68964]: _type = "Task" [ 1114.681305] env[68964]: } to complete. 
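The `Invoking Folder.CreateVM_Task ... Waiting for the task ... progress is 0%` sequence above is oslo.vmware's invoke-then-poll pattern. A sketch under the assumption of an already-resolved folder ref, config spec, and resource pool (host, credentials, and the vim managed-object refs are all placeholders here):

```python
# Sketch of the oslo.vmware invoke-then-poll pattern behind the
# CreateVM_Task / "Waiting for the task" / _poll_task lines above.
from oslo_vmware import api

def create_vm(folder_ref, config_spec, pool_ref):
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',   # placeholder vCenter + creds
        api_retry_count=3, task_poll_interval=0.5)
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=pool_ref)
    # Polls the task until it completes; raises a translated VimException
    # subclass if the task errors out, as seen later in this log.
    return session.wait_for_task(task_ref)
```

The `task-3431638` value in the log is the vCenter task reference that `wait_for_task` polls.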
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1114.689597] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431638, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1115.192709] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431638, 'name': CreateVM_Task, 'duration_secs': 0.327732} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1115.192979] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1115.194110] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1115.194353] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1115.194844] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1115.195159] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-723f3dc5-f275-44e5-9abe-47e48ae39fc0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1115.201602] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Waiting for the task: (returnval){ [ 1115.201602] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52c5582f-4f0b-5207-2b86-44d63049922e" [ 1115.201602] env[68964]: _type = "Task" [ 1115.201602] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1115.210558] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52c5582f-4f0b-5207-2b86-44d63049922e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1115.712981] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1115.713307] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1115.713572] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1116.711258] env[68964]: DEBUG nova.compute.manager [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Received event network-changed-f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1116.711511] env[68964]: DEBUG nova.compute.manager [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Refreshing instance network info cache due to event network-changed-f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d. 
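The "Received event network-changed-..." records above originate from Neutron notifying Nova through the compute API's external-events endpoint. A hedged sketch of that call (endpoint URL and token are placeholders; the payload shape is the documented os-server-external-events format):

```python
# Sketch of the os-server-external-events call behind the
# "Received event network-changed-<port>" records above.
import requests

def send_network_changed(nova_url, token, server_uuid, port_id):
    events = {"events": [{
        "name": "network-changed",
        "server_uuid": server_uuid,
        "tag": port_id,                     # the port UUID from the log
    }]}
    resp = requests.post(f"{nova_url}/os-server-external-events",
                         json=events, headers={"X-Auth-Token": token})
    resp.raise_for_status()                 # Nova returns per-event result codes
    return resp.json()
```

On receipt, the compute manager either wakes a waiter registered for the event or, as in the WARNING earlier, logs it as unexpected when no waiter exists.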
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1116.711753] env[68964]: DEBUG oslo_concurrency.lockutils [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] Acquiring lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1116.711904] env[68964]: DEBUG oslo_concurrency.lockutils [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] Acquired lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1116.719029] env[68964]: DEBUG nova.network.neutron [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Refreshing network info cache for port f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1116.926459] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1116.926714] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1117.160756] env[68964]: DEBUG nova.network.neutron [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Updated VIF entry in instance network info cache for port f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d. 
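The `Updating instance_info_cache with network_info: [...]` records above and below carry the serialized VIF model. A small sketch of pulling the useful addressing fields out of such a structure, trimmed to keys actually present in this log's blobs:

```python
# Trimmed to keys present in the network_info blobs in this log: pull the
# MAC, tap device, and fixed IPs out of one cached VIF entry.
network_info = [{
    "id": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d",
    "address": "fa:16:3e:a4:e7:ca",
    "network": {
        "bridge": "br-int",
        "subnets": [{"cidr": "192.168.233.0/24",
                     "ips": [{"address": "192.168.233.31", "type": "fixed"}]}],
    },
    "devname": "tapf9c3ede0-f9",
    "details": {"segmentation_id": 261},
}]
vif = network_info[0]
fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
print(vif["address"], vif["devname"], fixed_ips)  # MAC, tap dev, ['192.168.233.31']
```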
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1117.161169] env[68964]: DEBUG nova.network.neutron [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Updating instance_info_cache with network_info: [{"id": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "address": "fa:16:3e:a4:e7:ca", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.31", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf9c3ede0-f9", "ovs_interfaceid": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1117.170559] env[68964]: DEBUG oslo_concurrency.lockutils [req-25614e34-edf9-4354-945c-dff5e34de0c9 req-a2aba656-c7d4-46cc-9560-f41e1f917027 service nova] Releasing lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1123.297798] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1123.298067] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1136.551150] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1136.551446] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock 
"1b41b7f3-3ae4-48ca-aefc-5563060199d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1136.594318] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "15faae57-ab24-417e-9bf2-1aee11ccc2f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1136.594657] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "15faae57-ab24-417e-9bf2-1aee11ccc2f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1143.725032] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1143.725324] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1144.725043] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1145.719350] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1145.724072] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1146.570574] env[68964]: WARNING oslo_vmware.rw_handles [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1146.570574] env[68964]: ERROR 
oslo_vmware.rw_handles response.begin() [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1146.570574] env[68964]: ERROR oslo_vmware.rw_handles [ 1146.570574] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1146.572686] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1146.573283] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Copying Virtual Disk [datastore2] vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/b92c38fd-a4f4-4e39-a84b-a62caeba0f5a/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1146.573283] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3a044d69-235f-459d-9598-6931252f689c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.582902] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Waiting for the task: (returnval){ [ 1146.582902] env[68964]: value = "task-3431639" [ 1146.582902] env[68964]: _type = "Task" [ 1146.582902] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1146.590774] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Task: {'id': task-3431639, 'name': CopyVirtualDisk_Task} progress is 0%. 
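The WARNING and traceback above show the image-transfer handle dying with `http.client.RemoteDisconnected` when the far end closes the connection before sending a response. A generic illustration of retrying that failure mode (host and path are placeholders; this is not oslo.vmware's actual rw_handles recovery logic):

```python
# Generic retry around http.client.RemoteDisconnected, the failure mode in
# the rw_handles traceback above.
import http.client

def read_with_retry(host, path, attempts=3):
    last_exc = None
    for _ in range(attempts):
        conn = http.client.HTTPSConnection(host)
        try:
            conn.request('GET', path)
            return conn.getresponse().read()
        except http.client.RemoteDisconnected as exc:
            last_exc = exc      # server hung up before sending a status line
        finally:
            conn.close()
    raise last_exc
```

Note that in the log the download itself still completed ("Downloaded image file data ... to ... tmp-sparse.vmdk"), so the disconnect hit only the trailing response read.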
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1146.724536] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1146.724711] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1146.724920] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1146.738056] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1146.738302] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1146.738484] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1146.738642] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1146.739774] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4190270-34dc-430d-9c55-7553c2aac261 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.748670] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b483f3b1-5ddc-449d-a47b-f836dad24902 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.763326] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8da627c2-ea51-4185-9f56-29d545f73347 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.770953] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7941a86-91d8-482d-a24c-5819f75943ba {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1146.800923] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: 
name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1146.801101] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1146.801308] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1146.880972] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881146] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ff09bdbb-84e3-4182-8118-e99512a0e9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881273] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3770333e-4721-424d-ac86-2291c002e99a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881396] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881518] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881632] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881745] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881882] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.881989] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.882090] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1146.893188] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c1efc344-848b-4a98-a20a-57ebdfb5ac8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.904183] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.916008] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07d247db-b7ca-4b5f-818f-17411296d08f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.928058] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.938031] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a317d842-0282-4ace-a457-d8031cf0adca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.948958] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.959752] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.970487] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.980993] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1146.991016] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1147.001394] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 15faae57-ab24-417e-9bf2-1aee11ccc2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1147.001625] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1147.001772] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1147.094329] env[68964]: DEBUG oslo_vmware.exceptions [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1147.094617] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1147.095196] env[68964]: ERROR nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1147.095196] env[68964]: Faults: ['InvalidArgument'] [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Traceback (most recent call last): [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] yield resources [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self.driver.spawn(context, instance, image_meta, [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: 
e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self._fetch_image_if_missing(context, vi) [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] image_cache(vi, tmp_image_ds_loc) [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] vm_util.copy_virtual_disk( [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] session._wait_for_task(vmdk_copy_task) [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] return self.wait_for_task(task_ref) [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] return evt.wait() [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] result = hub.switch() [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] return self.greenlet.switch() [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self.f(*self.args, **self.kw) [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] raise exceptions.translate_fault(task_info.error) [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Faults: ['InvalidArgument'] [ 1147.095196] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] [ 1147.096596] env[68964]: INFO nova.compute.manager 
[None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Terminating instance [ 1147.096945] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1147.097235] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.097835] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1147.098038] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1147.098259] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-311c06fb-070f-4f57-8939-bc0cd7791b2a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.100472] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0599d774-514c-475e-bb91-91b3931a8410 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.109386] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1147.109598] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-123f7423-55b5-4d61-96f1-7b8a1be9effc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.111700] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.111868] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] 
Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1147.112814] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8cd336b4-80b0-420e-ae45-2a8500e2bec3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.119773] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){ [ 1147.119773] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d193a4-b851-a94f-554c-650daee18ce5" [ 1147.119773] env[68964]: _type = "Task" [ 1147.119773] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1147.126652] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d193a4-b851-a94f-554c-650daee18ce5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1147.236224] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e2588d2-9b4c-4051-8096-8e9a33f0a6dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.244045] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39e49a74-9340-49cf-940e-c9a07604b50f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.275232] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-853b6a5e-a212-4a26-900b-34649a05fe95 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.282196] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-565152c7-18f5-40fc-bc33-9e5bb432b772 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.295250] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1147.303675] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1147.316953] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee 
None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1147.317159] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.516s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1147.629562] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1147.629840] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating directory with path [datastore2] vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.630074] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e2b4b570-1c37-4eae-8a23-3b33ca7f35d9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.649402] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Created directory with path [datastore2] vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.649805] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Fetch image to [datastore2] vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1147.649926] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1147.650661] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b80723f2-4b7c-47f5-b1c7-ea8d0c5e5ed2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.657925] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20c3a125-68b7-41f7-9447-96f102752343 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1147.667525] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-228194b4-e889-4947-9faf-a41018b6db74 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.701257] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79acfaa7-c4cf-48e5-a51b-5eebdf14910d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.706952] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c91ee5b9-62b5-4d45-98c2-bab1d667c83c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1147.730980] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1147.779553] env[68964]: DEBUG oslo_vmware.rw_handles [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1147.839861] env[68964]: DEBUG oslo_vmware.rw_handles [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1147.840194] env[68964]: DEBUG oslo_vmware.rw_handles [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1149.318074] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1149.318420] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1149.318420] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1149.340861] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.341057] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.341273] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.341441] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.341594] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.341741] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.341888] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.342044] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.342368] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.342368] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1149.342435] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1149.343017] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1150.692209] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1150.692554] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1150.692640] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Deleting the datastore file [datastore2] e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1150.694036] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cf6642a7-2c48-42d2-923f-5ff3a4d89d60 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1150.698756] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Waiting for the task: (returnval){ [ 1150.698756] env[68964]: value = "task-3431641" [ 1150.698756] env[68964]: _type = "Task" [ 1150.698756] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1150.706473] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Task: {'id': task-3431641, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1151.208954] env[68964]: DEBUG oslo_vmware.api [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Task: {'id': task-3431641, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07509} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1151.209236] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1151.209420] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1151.209592] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1151.209763] env[68964]: INFO nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Took 4.11 seconds to destroy the instance on the hypervisor. 
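The DeleteDatastoreFile_Task records above follow the standard vSphere task pattern: the SOAP call returns a Task moref immediately, and the caller polls its TaskInfo (the "progress is 0%" line) until it reaches a terminal state, at which point any fault is translated into an exception; that is the same mechanism that produced the CopyVirtualDisk traceback earlier. Below is a minimal, self-contained sketch of such a polling loop; `poll_task_info`, `TaskFault`, and the attribute names are illustrative stand-ins, not the oslo.vmware API.

    import time

    class TaskFault(Exception):
        """Stand-in for the translated VimFaultException seen in the traceback."""

    def wait_for_task(poll_task_info, interval=0.5, timeout=300.0):
        # poll_task_info() returns an object with .state ('queued', 'running',
        # 'success' or 'error') and .error, loosely mirroring vSphere TaskInfo.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = poll_task_info()
            if info.state == 'success':
                return info                  # e.g. carries duration_secs on completion
            if info.state == 'error':
                raise TaskFault(info.error)  # the translate-fault step
            time.sleep(interval)             # a fixed-interval looping call in the real code
        raise TaskFault('timed out waiting for task')

The error branch is the path taken in the CopyVirtualDisk_Task case, where the task error carried InvalidArgument for fileType; the success branch matches the DeleteDatastoreFile_Task completion with duration_secs 0.07509 above.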
[ 1151.212410] env[68964]: DEBUG nova.compute.claims [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1151.212582] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.212792] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1151.511045] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db1408af-a2e4-4f7e-81d5-39c22321d9f1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.518361] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a761d35a-4b56-402c-abfa-f42f10f76957 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.547585] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e6ebdaa-a32d-44f0-a992-c61d55d580f0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.554125] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d995d6-8193-434e-b0ff-499734b6fb8c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1151.567480] env[68964]: DEBUG nova.compute.provider_tree [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1151.575489] env[68964]: DEBUG nova.scheduler.client.report [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1151.588175] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.375s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1151.588751] env[68964]: ERROR nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1151.588751] env[68964]: Faults: ['InvalidArgument'] [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Traceback (most recent call last): [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self.driver.spawn(context, instance, image_meta, [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self._fetch_image_if_missing(context, vi) [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] image_cache(vi, tmp_image_ds_loc) [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] vm_util.copy_virtual_disk( [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] session._wait_for_task(vmdk_copy_task) [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] return self.wait_for_task(task_ref) [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] return evt.wait() [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] result = hub.switch() [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] return self.greenlet.switch() [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] self.f(*self.args, **self.kw) [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] raise exceptions.translate_fault(task_info.error) [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Faults: ['InvalidArgument'] [ 1151.588751] env[68964]: ERROR nova.compute.manager [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] [ 1151.589598] env[68964]: DEBUG nova.compute.utils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1151.590889] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Build of instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa was re-scheduled: A specified parameter was not correct: fileType [ 1151.590889] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1151.591278] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1151.591448] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 
tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1151.591612] env[68964]: DEBUG nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1151.591768] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1151.916801] env[68964]: DEBUG nova.network.neutron [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1151.928536] env[68964]: INFO nova.compute.manager [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Took 0.34 seconds to deallocate network for instance. 
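The sequence from "Failed to build and run instance" to "Took 0.34 seconds to deallocate network for instance" traces the compute manager's reschedule path: the spawn fault aborts the resource claim, networking is torn down, and the build is handed back to the scheduler. A compressed sketch of that control flow follows, assuming hypothetical driver, claim and network_api objects; it shows the shape of the flow, not Nova's actual code.

    class RescheduledException(Exception):
        """Signals that the build failed on this host and should be retried elsewhere."""

    def build_and_run_instance(driver, claim, network_api, context, instance):
        try:
            driver.spawn(context, instance)
        except Exception as exc:  # the VimFaultException in the log above
            # Release what this host reserved, mirroring the "compute_resources"
            # abort_instance_claim lock lines.
            claim.abort()
            # Tear down networking: deallocate_for_instance() followed by an
            # empty instance_info_cache, as in the neutron lines above.
            network_api.deallocate_for_instance(context, instance)
            # Preserve the fault text so it reappears in the
            # "Build of instance ... was re-scheduled" message.
            raise RescheduledException(str(exc)) from exc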
[ 1152.023668] env[68964]: INFO nova.scheduler.client.report [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Deleted allocations for instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa [ 1152.069703] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7d0e64a-baa4-4c2f-bc84-6a07bf7d09f5 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 478.989s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.072022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 278.085s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1152.072022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Acquiring lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1152.072022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1152.072022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.074283] env[68964]: INFO nova.compute.manager [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Terminating instance [ 1152.076499] env[68964]: DEBUG nova.compute.manager [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Start destroying the instance on the hypervisor.
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1152.076683] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1152.076979] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-23cf218b-ccb7-4d29-8521-4115bb7d8c07 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.087114] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91ff9e0e-3e60-4c7e-b59e-91c55d452ad4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.099357] env[68964]: DEBUG nova.compute.manager [None req-598656b9-502b-4cef-9794-da74285cda21 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: 66318915-69a7-4f3a-8aa2-377948732cc5] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.121475] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa could not be found. [ 1152.121684] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1152.121860] env[68964]: INFO nova.compute.manager [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1152.122209] env[68964]: DEBUG oslo.service.loopingcall [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1152.122364] env[68964]: DEBUG nova.compute.manager [-] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1152.122488] env[68964]: DEBUG nova.network.neutron [-] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1152.129120] env[68964]: DEBUG nova.compute.manager [None req-598656b9-502b-4cef-9794-da74285cda21 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: 66318915-69a7-4f3a-8aa2-377948732cc5] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.154516] env[68964]: DEBUG oslo_concurrency.lockutils [None req-598656b9-502b-4cef-9794-da74285cda21 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "66318915-69a7-4f3a-8aa2-377948732cc5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.393s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.159863] env[68964]: DEBUG nova.network.neutron [-] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1152.168081] env[68964]: DEBUG nova.compute.manager [None req-ef551ca1-d475-435a-b88a-a49a772cb711 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] [instance: 5c076ffe-9532-4d57-b044-a74a48cb147d] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.171049] env[68964]: INFO nova.compute.manager [-] [instance: e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa] Took 0.05 seconds to deallocate network for instance. [ 1152.195453] env[68964]: DEBUG nova.compute.manager [None req-ef551ca1-d475-435a-b88a-a49a772cb711 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] [instance: 5c076ffe-9532-4d57-b044-a74a48cb147d] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.224059] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ef551ca1-d475-435a-b88a-a49a772cb711 tempest-VolumesAdminNegativeTest-840646964 tempest-VolumesAdminNegativeTest-840646964-project-member] Lock "5c076ffe-9532-4d57-b044-a74a48cb147d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 232.374s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.238407] env[68964]: DEBUG nova.compute.manager [None req-a43e778a-5369-429c-85e3-1e2ecb4d0013 tempest-ServerAddressesNegativeTestJSON-1242941731 tempest-ServerAddressesNegativeTestJSON-1242941731-project-member] [instance: 41317213-a0f2-42fc-9e44-dfe83d27a811] Starting instance...
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.267681] env[68964]: DEBUG nova.compute.manager [None req-a43e778a-5369-429c-85e3-1e2ecb4d0013 tempest-ServerAddressesNegativeTestJSON-1242941731 tempest-ServerAddressesNegativeTestJSON-1242941731-project-member] [instance: 41317213-a0f2-42fc-9e44-dfe83d27a811] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.288903] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c59bbb14-2076-4957-b47e-51b53a8031f1 tempest-FloatingIPsAssociationNegativeTestJSON-116374462 tempest-FloatingIPsAssociationNegativeTestJSON-116374462-project-member] Lock "e8b202bc-fb75-4cd6-9b12-5c37e0ca06fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.218s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.293102] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a43e778a-5369-429c-85e3-1e2ecb4d0013 tempest-ServerAddressesNegativeTestJSON-1242941731 tempest-ServerAddressesNegativeTestJSON-1242941731-project-member] Lock "41317213-a0f2-42fc-9e44-dfe83d27a811" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 228.972s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.302892] env[68964]: DEBUG nova.compute.manager [None req-8240fc55-8211-4a42-aa95-a0c9eef58693 tempest-ServerShowV257Test-1966615260 tempest-ServerShowV257Test-1966615260-project-member] [instance: 864ec33b-2840-4ed3-b0b6-2ef062141705] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.325204] env[68964]: DEBUG nova.compute.manager [None req-8240fc55-8211-4a42-aa95-a0c9eef58693 tempest-ServerShowV257Test-1966615260 tempest-ServerShowV257Test-1966615260-project-member] [instance: 864ec33b-2840-4ed3-b0b6-2ef062141705] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.353147] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8240fc55-8211-4a42-aa95-a0c9eef58693 tempest-ServerShowV257Test-1966615260 tempest-ServerShowV257Test-1966615260-project-member] Lock "864ec33b-2840-4ed3-b0b6-2ef062141705" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.210s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.364805] env[68964]: DEBUG nova.compute.manager [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.396720] env[68964]: DEBUG nova.compute.manager [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52] Instance disappeared before build.
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.421420] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "57dbb1fb-8a62-4127-ac5c-4d7bf1fa2a52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.699s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.432767] env[68964]: DEBUG nova.compute.manager [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: c7853bb3-fa53-4911-818f-e03245ad3a0c] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.457352] env[68964]: DEBUG nova.compute.manager [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: c7853bb3-fa53-4911-818f-e03245ad3a0c] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.479265] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b625f035-6d3f-4103-ad0a-7b5f16839c9c tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "c7853bb3-fa53-4911-818f-e03245ad3a0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.723s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.491706] env[68964]: DEBUG nova.compute.manager [None req-998facda-bbd0-4953-b652-1aea56ea8704 tempest-ServerActionsTestOtherB-1759788798 tempest-ServerActionsTestOtherB-1759788798-project-member] [instance: 19f90c65-2865-4fa7-b647-f69fd217e1e4] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.517800] env[68964]: DEBUG nova.compute.manager [None req-998facda-bbd0-4953-b652-1aea56ea8704 tempest-ServerActionsTestOtherB-1759788798 tempest-ServerActionsTestOtherB-1759788798-project-member] [instance: 19f90c65-2865-4fa7-b647-f69fd217e1e4] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.538465] env[68964]: DEBUG oslo_concurrency.lockutils [None req-998facda-bbd0-4953-b652-1aea56ea8704 tempest-ServerActionsTestOtherB-1759788798 tempest-ServerActionsTestOtherB-1759788798-project-member] Lock "19f90c65-2865-4fa7-b647-f69fd217e1e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.616s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.551114] env[68964]: DEBUG nova.compute.manager [None req-cc24d389-ee17-4a7f-9b89-78a15f8dd133 tempest-TenantUsagesTestJSON-431596998 tempest-TenantUsagesTestJSON-431596998-project-member] [instance: c1efc344-848b-4a98-a20a-57ebdfb5ac8c] Starting instance...
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.574302] env[68964]: DEBUG nova.compute.manager [None req-cc24d389-ee17-4a7f-9b89-78a15f8dd133 tempest-TenantUsagesTestJSON-431596998 tempest-TenantUsagesTestJSON-431596998-project-member] [instance: c1efc344-848b-4a98-a20a-57ebdfb5ac8c] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1152.597742] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cc24d389-ee17-4a7f-9b89-78a15f8dd133 tempest-TenantUsagesTestJSON-431596998 tempest-TenantUsagesTestJSON-431596998-project-member] Lock "c1efc344-848b-4a98-a20a-57ebdfb5ac8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.364s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1152.607588] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1152.661879] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1152.664088] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1152.664088] env[68964]: INFO nova.compute.claims [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1152.929012] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-490ebf9f-3b95-41e2-811a-5beda045f272 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.936911] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f827da4-2a08-45fe-b72f-63eedcdcc05e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.967790] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7f6bb93-a73d-4bb7-bc25-4858ac1fb7ff {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.976596] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94f8fea1-de49-4cbc-8905-72a6b3c3c541 {{(pid=68964) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1152.988643] env[68964]: DEBUG nova.compute.provider_tree [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1152.997668] env[68964]: DEBUG nova.scheduler.client.report [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1153.012449] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1153.012961] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1153.043751] env[68964]: DEBUG nova.compute.utils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1153.045177] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1153.045350] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1153.055603] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Start building block device mappings for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1153.102036] env[68964]: DEBUG nova.policy [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8422a3a588c493e9ce8a59c70c22efb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2927dfd3741e4023b5d6c1c837dd6baa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1153.120964] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1153.147232] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1153.147232] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1153.147232] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1153.147232] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1153.147618] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1153.147618] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1153.147618] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1153.148563] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1153.148563] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1153.148563] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1153.148563] env[68964]: DEBUG nova.virt.hardware [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1153.149683] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a94ca6b-62dc-46d4-b381-a0985899088d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1153.157834] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7f54329-fa12-4ec7-a97e-19c32e64a718 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1153.580469] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Successfully created port: 5e40bb1a-6653-4fc9-8377-0fc1ceadfebb {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1154.247751] env[68964]: DEBUG nova.compute.manager [req-6fe40e60-c992-4485-919b-6f59e2a5f4ef req-453c2727-45b0-4bff-a12b-8fd033bee617 service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Received event network-vif-plugged-5e40bb1a-6653-4fc9-8377-0fc1ceadfebb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1154.248027] env[68964]: DEBUG oslo_concurrency.lockutils [req-6fe40e60-c992-4485-919b-6f59e2a5f4ef req-453c2727-45b0-4bff-a12b-8fd033bee617 service nova] Acquiring lock "244140d1-bf22-415a-b770-05f2fe106149-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1154.248179] env[68964]: DEBUG oslo_concurrency.lockutils [req-6fe40e60-c992-4485-919b-6f59e2a5f4ef req-453c2727-45b0-4bff-a12b-8fd033bee617 service nova] Lock "244140d1-bf22-415a-b770-05f2fe106149-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1154.248345] env[68964]: DEBUG oslo_concurrency.lockutils [req-6fe40e60-c992-4485-919b-6f59e2a5f4ef req-453c2727-45b0-4bff-a12b-8fd033bee617 service nova] Lock "244140d1-bf22-415a-b770-05f2fe106149-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1154.248510] env[68964]: DEBUG nova.compute.manager [req-6fe40e60-c992-4485-919b-6f59e2a5f4ef req-453c2727-45b0-4bff-a12b-8fd033bee617 service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] No waiting events found dispatching network-vif-plugged-5e40bb1a-6653-4fc9-8377-0fc1ceadfebb {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1154.248674] env[68964]: WARNING nova.compute.manager [req-6fe40e60-c992-4485-919b-6f59e2a5f4ef req-453c2727-45b0-4bff-a12b-8fd033bee617 service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Received unexpected event network-vif-plugged-5e40bb1a-6653-4fc9-8377-0fc1ceadfebb for instance with vm_state building and task_state spawning.
[ 1154.402622] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Successfully updated port: 5e40bb1a-6653-4fc9-8377-0fc1ceadfebb {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1154.419133] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "refresh_cache-244140d1-bf22-415a-b770-05f2fe106149" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1154.419290] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquired lock "refresh_cache-244140d1-bf22-415a-b770-05f2fe106149" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1154.419451] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1154.469028] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1154.725526] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Updating instance_info_cache with network_info: [{"id": "5e40bb1a-6653-4fc9-8377-0fc1ceadfebb", "address": "fa:16:3e:11:a6:29", "network": {"id": "ef98bb6c-f28b-4b60-b3af-5c9da8a9e469", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1097201124-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2927dfd3741e4023b5d6c1c837dd6baa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6a2e2e51-010f-4535-ba88-433663275996", "external-id": "nsx-vlan-transportzone-915", "segmentation_id": 915, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e40bb1a-66", "ovs_interfaceid": "5e40bb1a-6653-4fc9-8377-0fc1ceadfebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1154.740028] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Releasing lock "refresh_cache-244140d1-bf22-415a-b770-05f2fe106149" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1154.740028] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance network_info: |[{"id": "5e40bb1a-6653-4fc9-8377-0fc1ceadfebb", "address": "fa:16:3e:11:a6:29", "network": {"id": "ef98bb6c-f28b-4b60-b3af-5c9da8a9e469", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1097201124-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2927dfd3741e4023b5d6c1c837dd6baa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6a2e2e51-010f-4535-ba88-433663275996", "external-id": "nsx-vlan-transportzone-915", "segmentation_id": 915, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e40bb1a-66", "ovs_interfaceid": "5e40bb1a-6653-4fc9-8377-0fc1ceadfebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1154.740028] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:11:a6:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6a2e2e51-010f-4535-ba88-433663275996', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5e40bb1a-6653-4fc9-8377-0fc1ceadfebb', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1154.746850] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Creating folder: Project (2927dfd3741e4023b5d6c1c837dd6baa). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1154.748017] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7436d7c2-e4be-4b98-ad3e-59d91d619e41 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1154.759069] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Created folder: Project (2927dfd3741e4023b5d6c1c837dd6baa) in parent group-v684465.
[ 1154.759402] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Creating folder: Instances. Parent ref: group-v684549. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1154.759759] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2082f966-8584-427c-b54e-cc9a83c7115a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1154.770808] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Created folder: Instances in parent group-v684549.
[ 1154.771250] env[68964]: DEBUG oslo.service.loopingcall [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1154.771548] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1154.771858] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-09045721-ab01-4917-b320-2570a7e20e07 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1154.794020] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1154.794020] env[68964]: value = "task-3431644"
[ 1154.794020] env[68964]: _type = "Task"
[ 1154.794020] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1154.799733] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431644, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1155.300946] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431644, 'name': CreateVM_Task, 'duration_secs': 0.282034} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1155.302054] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1155.302216] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1155.302385] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1155.302703] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1155.302985] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-45a414a1-e648-4078-bf8b-9bd66e6ea062 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1155.307185] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Waiting for the task: (returnval){
[ 1155.307185] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52754c41-590c-749d-d184-78560ecc3235"
[ 1155.307185] env[68964]: _type = "Task"
[ 1155.307185] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1155.314239] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52754c41-590c-749d-d184-78560ecc3235, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1155.817245] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1155.817525] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1155.817742] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1156.371296] env[68964]: DEBUG nova.compute.manager [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Received event network-changed-5e40bb1a-6653-4fc9-8377-0fc1ceadfebb {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1156.371296] env[68964]: DEBUG nova.compute.manager [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Refreshing instance network info cache due to event network-changed-5e40bb1a-6653-4fc9-8377-0fc1ceadfebb. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1156.371296] env[68964]: DEBUG oslo_concurrency.lockutils [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] Acquiring lock "refresh_cache-244140d1-bf22-415a-b770-05f2fe106149" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1156.371296] env[68964]: DEBUG oslo_concurrency.lockutils [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] Acquired lock "refresh_cache-244140d1-bf22-415a-b770-05f2fe106149" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1156.371296] env[68964]: DEBUG nova.network.neutron [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Refreshing network info cache for port 5e40bb1a-6653-4fc9-8377-0fc1ceadfebb {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1156.649856] env[68964]: DEBUG nova.network.neutron [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Updated VIF entry in instance network info cache for port 5e40bb1a-6653-4fc9-8377-0fc1ceadfebb. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1156.650217] env[68964]: DEBUG nova.network.neutron [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Updating instance_info_cache with network_info: [{"id": "5e40bb1a-6653-4fc9-8377-0fc1ceadfebb", "address": "fa:16:3e:11:a6:29", "network": {"id": "ef98bb6c-f28b-4b60-b3af-5c9da8a9e469", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1097201124-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2927dfd3741e4023b5d6c1c837dd6baa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6a2e2e51-010f-4535-ba88-433663275996", "external-id": "nsx-vlan-transportzone-915", "segmentation_id": 915, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e40bb1a-66", "ovs_interfaceid": "5e40bb1a-6653-4fc9-8377-0fc1ceadfebb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1156.659518] env[68964]: DEBUG oslo_concurrency.lockutils [req-598cc7c4-6acd-4210-93f2-18495ed8aac6 req-cb65b319-ffa1-4187-a874-d13664c9f3fd service nova] Releasing lock "refresh_cache-244140d1-bf22-415a-b770-05f2fe106149" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1157.011351] env[68964]: WARNING oslo_vmware.rw_handles [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1157.011351] env[68964]: ERROR oslo_vmware.rw_handles
[ 1157.011710] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1157.013868] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1157.013980] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Copying Virtual Disk [datastore1] vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/a7461050-b058-4501-b422-99cf76e90fe8/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1157.014322] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-361bdc26-0173-4dad-9cac-deb9e1530077 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1157.022367] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Waiting for the task: (returnval){
[ 1157.022367] env[68964]: value = "task-3431645"
[ 1157.022367] env[68964]: _type = "Task"
[ 1157.022367] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1157.030709] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Task: {'id': task-3431645, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1157.535030] env[68964]: DEBUG oslo_vmware.exceptions [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1157.535030] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1157.535030] env[68964]: ERROR nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1157.535030] env[68964]: Faults: ['InvalidArgument']
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] Traceback (most recent call last):
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] yield resources
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self.driver.spawn(context, instance, image_meta,
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self._fetch_image_if_missing(context, vi)
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] image_cache(vi, tmp_image_ds_loc)
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] vm_util.copy_virtual_disk(
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] session._wait_for_task(vmdk_copy_task)
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] return self.wait_for_task(task_ref)
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] return evt.wait()
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] result = hub.switch()
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] return self.greenlet.switch()
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1157.535030] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self.f(*self.args, **self.kw)
[ 1157.536192] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1157.536192] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] raise exceptions.translate_fault(task_info.error)
[ 1157.536192] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1157.536192] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] Faults: ['InvalidArgument']
[ 1157.536192] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a]
[ 1157.536192] env[68964]: INFO nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Terminating instance
[ 1157.540020] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1157.540020] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1157.540020] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1157.540020] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1157.540020] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-998f26e7-bef8-4eb6-a93d-f281df70aad1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1157.542056] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baf10fdb-52bf-4991-9b2f-3c631723a2ef {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.256870] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1158.256870] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-03546f01-d9d4-4d27-8efb-64c47fbb5ce1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.262014] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1158.262014] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1158.262014] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a35b7532-b3cb-4960-b534-eac8e5f1d488 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.266036] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){
[ 1158.266036] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5237f096-c395-0305-5e8b-bb09f4bb8104"
[ 1158.266036] env[68964]: _type = "Task"
[ 1158.266036] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1158.272216] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5237f096-c395-0305-5e8b-bb09f4bb8104, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1158.321326] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1158.321516] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1158.321699] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Deleting the datastore file [datastore1] 3770333e-4721-424d-ac86-2291c002e99a {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1158.321968] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c5000e33-5c11-4471-a85b-9e5f442a045d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.327824] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Waiting for the task: (returnval){
[ 1158.327824] env[68964]: value = "task-3431647"
[ 1158.327824] env[68964]: _type = "Task"
[ 1158.327824] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1158.335236] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Task: {'id': task-3431647, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1158.775168] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1158.775527] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating directory with path [datastore1] vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1158.775725] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9c930f07-42bb-4005-ac07-77259f50c133 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.787767] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Created directory with path [datastore1] vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1158.787951] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Fetch image to [datastore1] vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1158.788152] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1158.788877] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e28a37b0-cb17-484f-8753-f7a56fac7e2f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.795441] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a64b88d4-0ef0-4a5c-96ca-c04a257ca275 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.804318] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75214943-7571-4cd5-ad37-d1cadc7d6bdd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.837027] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d2e11a6-adf8-4274-b305-7ce9ec110fd2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.845096] env[68964]: DEBUG oslo_vmware.api [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Task: {'id': task-3431647, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079015} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1158.845360] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-72a53c9f-4a46-4e2a-af1c-9324913c5380 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1158.846939] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1158.847133] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1158.847304] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1158.847475] env[68964]: INFO nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Took 1.31 seconds to destroy the instance on the hypervisor.
[ 1158.849479] env[68964]: DEBUG nova.compute.claims [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1158.849644] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1158.849848] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1158.869437] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1158.918916] env[68964]: DEBUG oslo_vmware.rw_handles [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1158.980404] env[68964]: DEBUG oslo_vmware.rw_handles [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1158.980404] env[68964]: DEBUG oslo_vmware.rw_handles [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1159.181086] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9049ba0f-05b3-4a38-9b61-40bf0308b634 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.188668] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fa5a7a9-a746-4d6e-bc78-3ebb1276a407 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.219346] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7b585b5-1ee4-430a-b164-471fd6566216 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.226699] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01257e4c-12c7-45f6-8dc3-b448cf948a1a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.239825] env[68964]: DEBUG nova.compute.provider_tree [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1159.251859] env[68964]: DEBUG nova.scheduler.client.report [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1159.267978] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.418s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1159.268583] env[68964]: ERROR nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1159.268583] env[68964]: Faults: ['InvalidArgument'] [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] Traceback (most recent call last): [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/compute/manager.py", line 
2615, in _build_and_run_instance [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self.driver.spawn(context, instance, image_meta, [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self._fetch_image_if_missing(context, vi) [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] image_cache(vi, tmp_image_ds_loc) [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] vm_util.copy_virtual_disk( [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] session._wait_for_task(vmdk_copy_task) [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] return self.wait_for_task(task_ref) [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] return evt.wait() [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] result = hub.switch() [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] return self.greenlet.switch() [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] self.f(*self.args, **self.kw) [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 
3770333e-4721-424d-ac86-2291c002e99a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] raise exceptions.translate_fault(task_info.error) [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] Faults: ['InvalidArgument'] [ 1159.268583] env[68964]: ERROR nova.compute.manager [instance: 3770333e-4721-424d-ac86-2291c002e99a] [ 1159.269532] env[68964]: DEBUG nova.compute.utils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1159.270926] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Build of instance 3770333e-4721-424d-ac86-2291c002e99a was re-scheduled: A specified parameter was not correct: fileType [ 1159.270926] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1159.271257] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1159.271451] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1159.271621] env[68964]: DEBUG nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1159.271779] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1159.704063] env[68964]: DEBUG nova.network.neutron [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1159.716218] env[68964]: INFO nova.compute.manager [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Took 0.44 seconds to deallocate network for instance. [ 1159.836947] env[68964]: INFO nova.scheduler.client.report [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Deleted allocations for instance 3770333e-4721-424d-ac86-2291c002e99a [ 1159.860067] env[68964]: DEBUG oslo_concurrency.lockutils [None req-aa203fad-c7f1-4387-86a9-f0199c799238 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "3770333e-4721-424d-ac86-2291c002e99a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 384.983s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1159.861951] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "3770333e-4721-424d-ac86-2291c002e99a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 188.682s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.863746] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Acquiring lock "3770333e-4721-424d-ac86-2291c002e99a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.864020] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] 
Lock "3770333e-4721-424d-ac86-2291c002e99a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.864535] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "3770333e-4721-424d-ac86-2291c002e99a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1159.866740] env[68964]: INFO nova.compute.manager [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Terminating instance [ 1159.868776] env[68964]: DEBUG nova.compute.manager [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1159.868978] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1159.869488] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2f1c7d51-969e-46d2-9b9f-305c6a5b4e6a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.878515] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ee39e03-e266-44fd-8e19-4634e56254f9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.890990] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1159.912787] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3770333e-4721-424d-ac86-2291c002e99a could not be found. 
[ 1159.913042] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1159.913232] env[68964]: INFO nova.compute.manager [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1159.913483] env[68964]: DEBUG oslo.service.loopingcall [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1159.913727] env[68964]: DEBUG nova.compute.manager [-] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1159.913824] env[68964]: DEBUG nova.network.neutron [-] [instance: 3770333e-4721-424d-ac86-2291c002e99a] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1159.942255] env[68964]: DEBUG nova.network.neutron [-] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1159.947337] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.947651] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.949613] env[68964]: INFO nova.compute.claims [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1159.953178] env[68964]: INFO nova.compute.manager [-] [instance: 3770333e-4721-424d-ac86-2291c002e99a] Took 0.04 seconds to deallocate network for instance. 
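The Acquiring/acquired/released lines around Lock "compute_resources" show oslo.concurrency serializing ResourceTracker.instance_claim per compute service, so the capacity check and the claim recorded above ("Claim successful on node domain-c8...") happen atomically with respect to concurrent builds. A toy sketch of that locking pattern, assuming only the serialization requirement (Nova's real tracker does far more bookkeeping):

from oslo_concurrency import lockutils


class ResourceTrackerSketch:
    """Toy tracker; illustrative only."""

    def __init__(self):
        self.claimed = {}

    @lockutils.synchronized('compute_resources')
    def instance_claim(self, instance_uuid, vcpus, memory_mb):
        # Only one green thread per service runs in here at a time, so
        # checking free capacity and recording the claim cannot race with
        # another build request on the same host.
        self.claimed[instance_uuid] = (vcpus, memory_mb)
        return self.claimed[instance_uuid]

The "waited 0.000s" / "held 0.394s" figures in the log are this same lock's wait and hold times as reported by lockutils.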
[ 1160.058423] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a1a06539-5ba9-403b-a091-875a64fa8d81 tempest-InstanceActionsNegativeTestJSON-115861761 tempest-InstanceActionsNegativeTestJSON-115861761-project-member] Lock "3770333e-4721-424d-ac86-2291c002e99a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.197s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1160.251511] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-980b127f-0838-410f-8719-704c09d107c1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.262486] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9443e79c-b10b-4171-a80e-e8b7063fe444 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.295834] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d7b99ee-d9b8-45cf-9a76-bb1bade679d2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.303644] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-048970c1-6c3c-41a6-83e9-071c38eca6a1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.317311] env[68964]: DEBUG nova.compute.provider_tree [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1160.326769] env[68964]: DEBUG nova.scheduler.client.report [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1160.341957] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.394s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1160.342377] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1160.385667] env[68964]: DEBUG nova.compute.utils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1160.387894] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1160.389037] env[68964]: DEBUG nova.network.neutron [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1160.399906] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1160.444552] env[68964]: INFO nova.virt.block_device [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Booting with volume 3d5ebda2-4a33-4090-b35f-fbdd44888d65 at /dev/sda [ 1160.468720] env[68964]: DEBUG nova.policy [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1658b6e2a14b4a95857b90ab860edf08', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac126dc7d056478bb4048368c7722048', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1160.494298] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-41f1a5e8-1280-4120-bcf2-93d46a751f84 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.503299] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23316a74-71d8-4110-8dd1-06cccb009204 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.536343] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6d1e9081-27aa-4d8e-a36c-a95804bcae78 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.548350] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a004a33-fa7b-4266-877c-d2074c084190 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.584382] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d20748c7-4bce-47e4-b85e-6694e5c25b7c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.591654] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6794b19d-3286-4e9e-b675-69ed17c6659b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.605834] env[68964]: DEBUG nova.virt.block_device [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updating existing volume attachment record: b63a4d9c-ada1-4d0c-a7af-83af6a2f9866 {{(pid=68964) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1160.890419] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1160.891087] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1160.891340] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1160.891672] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1160.891672] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1160.891809] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1160.891950] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1160.892161] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1160.892321] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1160.892493] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1160.892666] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1160.892889] env[68964]: DEBUG nova.virt.hardware [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1160.894340] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8803b42a-f222-437a-933d-7e12dbcdd08a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.904387] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b0c68a0-c03e-400b-b33f-a76fed4274f3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.991509] env[68964]: DEBUG nova.network.neutron [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Successfully created port: 256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1161.752213] env[68964]: DEBUG nova.network.neutron [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Successfully updated port: 256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 
1161.767771] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquiring lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1161.767771] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquired lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1161.767771] env[68964]: DEBUG nova.network.neutron [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1161.833984] env[68964]: DEBUG nova.network.neutron [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1161.892021] env[68964]: DEBUG nova.compute.manager [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Received event network-vif-plugged-256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1161.892021] env[68964]: DEBUG oslo_concurrency.lockutils [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] Acquiring lock "07d247db-b7ca-4b5f-818f-17411296d08f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1161.892021] env[68964]: DEBUG oslo_concurrency.lockutils [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] Lock "07d247db-b7ca-4b5f-818f-17411296d08f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1161.892021] env[68964]: DEBUG oslo_concurrency.lockutils [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] Lock "07d247db-b7ca-4b5f-818f-17411296d08f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1161.892021] env[68964]: DEBUG nova.compute.manager [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] No waiting events found dispatching network-vif-plugged-256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1161.892021] env[68964]: WARNING nova.compute.manager 
[req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Received unexpected event network-vif-plugged-256e20da-4af5-4b2c-ad18-496d09fc80a4 for instance with vm_state building and task_state spawning. [ 1161.892021] env[68964]: DEBUG nova.compute.manager [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Received event network-changed-256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1161.892021] env[68964]: DEBUG nova.compute.manager [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Refreshing instance network info cache due to event network-changed-256e20da-4af5-4b2c-ad18-496d09fc80a4. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1161.892021] env[68964]: DEBUG oslo_concurrency.lockutils [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] Acquiring lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1162.280087] env[68964]: DEBUG nova.network.neutron [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updating instance_info_cache with network_info: [{"id": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "address": "fa:16:3e:54:cf:5d", "network": {"id": "5f97c7c0-ce44-4d39-a916-1eebb6485c7e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1731903982-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ac126dc7d056478bb4048368c7722048", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98d96b75-ac36-499a-adc2-130c8c1d55ca", "external-id": "nsx-vlan-transportzone-564", "segmentation_id": 564, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap256e20da-4a", "ovs_interfaceid": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1162.304041] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Releasing lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1162.304482] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 
07d247db-b7ca-4b5f-818f-17411296d08f] Instance network_info: |[{"id": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "address": "fa:16:3e:54:cf:5d", "network": {"id": "5f97c7c0-ce44-4d39-a916-1eebb6485c7e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1731903982-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ac126dc7d056478bb4048368c7722048", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98d96b75-ac36-499a-adc2-130c8c1d55ca", "external-id": "nsx-vlan-transportzone-564", "segmentation_id": 564, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap256e20da-4a", "ovs_interfaceid": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1162.305040] env[68964]: DEBUG oslo_concurrency.lockutils [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] Acquired lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1162.305352] env[68964]: DEBUG nova.network.neutron [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Refreshing network info cache for port 256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1162.308664] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:54:cf:5d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '98d96b75-ac36-499a-adc2-130c8c1d55ca', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '256e20da-4af5-4b2c-ad18-496d09fc80a4', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1162.319440] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Creating folder: Project (ac126dc7d056478bb4048368c7722048). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1162.319440] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0d60c22e-1871-4ef3-a97a-9154194a9dd2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.334144] env[68964]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. 
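The suds WARNING after the Folder.CreateFolder call is a vCenter SOAP fault surfacing, and as the next lines show, the DuplicateName fault is treated as "folder already exists" rather than as an error: creation is attempted first and the loser of the race reuses the existing folder. A hedged sketch of that create-or-reuse pattern; find_existing stands in for the property-collector lookup Nova performs and is supplied by the caller.

from oslo_vmware import exceptions as vexc


def create_folder_idempotent(session, parent_ref, name, find_existing):
    """Create a child folder under parent_ref, tolerating concurrent creation.

    find_existing(parent_ref, name) is a hypothetical caller-supplied lookup.
    """
    try:
        return session.invoke_api(session.vim, 'CreateFolder',
                                  parent_ref, name=name)
    except vexc.DuplicateName:
        # Concurrent builds for the same project race on folder creation;
        # the DuplicateName fault (logged by suds as an "internal server
        # error") just means someone else won, so reuse that folder.
        return find_existing(parent_ref, name)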
[ 1162.334311] env[68964]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=68964) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 1162.334679] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Folder already exists: Project (ac126dc7d056478bb4048368c7722048). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1162.334864] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Creating folder: Instances. Parent ref: group-v684530. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1162.335108] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-39356af0-dde2-4bda-8fda-24afb505f666 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.346121] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Created folder: Instances in parent group-v684530. [ 1162.346568] env[68964]: DEBUG oslo.service.loopingcall [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1162.346764] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1162.346954] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f96f3473-d1d1-4323-b441-404deafeecb7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.368126] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1162.368126] env[68964]: value = "task-3431650" [ 1162.368126] env[68964]: _type = "Task" [ 1162.368126] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1162.377296] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431650, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1162.880365] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431650, 'name': CreateVM_Task, 'duration_secs': 0.290568} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1162.880553] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1162.881295] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-684536', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'name': 'volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '07d247db-b7ca-4b5f-818f-17411296d08f', 'attached_at': '', 'detached_at': '', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'serial': '3d5ebda2-4a33-4090-b35f-fbdd44888d65'}, 'delete_on_termination': True, 'attachment_id': 'b63a4d9c-ada1-4d0c-a7af-83af6a2f9866', 'boot_index': 0, 'disk_bus': None, 'mount_device': '/dev/sda', 'device_type': None, 'guest_format': None, 'volume_type': None}], 'swap': None} {{(pid=68964) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1162.881530] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Root volume attach. Driver type: vmdk {{(pid=68964) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1162.882319] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-874be113-c8fd-4489-b6df-64092733c7f5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.890610] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07fca90-249d-4367-af57-6ae4fa9ae5c2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.896389] env[68964]: DEBUG nova.network.neutron [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updated VIF entry in instance network info cache for port 256e20da-4af5-4b2c-ad18-496d09fc80a4. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1162.896710] env[68964]: DEBUG nova.network.neutron [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updating instance_info_cache with network_info: [{"id": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "address": "fa:16:3e:54:cf:5d", "network": {"id": "5f97c7c0-ce44-4d39-a916-1eebb6485c7e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1731903982-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ac126dc7d056478bb4048368c7722048", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98d96b75-ac36-499a-adc2-130c8c1d55ca", "external-id": "nsx-vlan-transportzone-564", "segmentation_id": 564, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap256e20da-4a", "ovs_interfaceid": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1162.898215] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53bca834-33e3-4f2d-a120-67cfbd56cecd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.904840] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-9ef7f371-00e1-4890-82f5-74e4f6966c67 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.908322] env[68964]: DEBUG oslo_concurrency.lockutils [req-be08321f-5764-4275-9d70-c88d33534e56 req-1893cf87-d5ec-44e4-b0a0-5a750d508208 service nova] Releasing lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1162.913757] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){ [ 1162.913757] env[68964]: value = "task-3431651" [ 1162.913757] env[68964]: _type = "Task" [ 1162.913757] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1162.922124] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1163.425206] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 42%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1163.925535] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 54%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1164.426580] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 69%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1164.926644] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 82%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1165.428226] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 97%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1165.932108] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 98%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1166.436522] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 98%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1166.932555] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 98%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1167.431763] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 98%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1167.933313] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.239457] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "07ea329b-3934-437a-8b44-57045e86c310" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.239693] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "07ea329b-3934-437a-8b44-57045e86c310" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1168.433585] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431651, 'name': RelocateVM_Task, 'duration_secs': 5.029767} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1168.433866] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Volume attach. 
Driver type: vmdk {{(pid=68964) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1168.434097] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-684536', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'name': 'volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '07d247db-b7ca-4b5f-818f-17411296d08f', 'attached_at': '', 'detached_at': '', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'serial': '3d5ebda2-4a33-4090-b35f-fbdd44888d65'} {{(pid=68964) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1168.434845] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ae1a46d-71f1-4064-b980-b8d3b9572a7f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.453364] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c22c690-4147-471d-8aa2-0e5e3e3b5187 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.476556] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Reconfiguring VM instance instance-00000038 to attach disk [datastore1] volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65/volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65.vmdk or device None with type thin {{(pid=68964) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1168.476857] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-cc2cf99d-2124-4162-9276-1a8640b187a5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.496272] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){ [ 1168.496272] env[68964]: value = "task-3431652" [ 1168.496272] env[68964]: _type = "Task" [ 1168.496272] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1168.504831] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431652, 'name': ReconfigVM_Task} progress is 6%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1168.635174] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "244140d1-bf22-415a-b770-05f2fe106149" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1169.010910] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431652, 'name': ReconfigVM_Task, 'duration_secs': 0.280003} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1169.011694] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Reconfigured VM instance instance-00000038 to attach disk [datastore1] volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65/volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65.vmdk or device None with type thin {{(pid=68964) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}}
[ 1169.018180] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-2aa09101-83f6-4d24-b01a-22c7a8ff8281 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1169.035064] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1169.035064] env[68964]: value = "task-3431653"
[ 1169.035064] env[68964]: _type = "Task"
[ 1169.035064] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1169.045505] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431653, 'name': ReconfigVM_Task} progress is 6%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1169.546713] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431653, 'name': ReconfigVM_Task, 'duration_secs': 0.114432} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1169.546713] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-684536', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'name': 'volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '07d247db-b7ca-4b5f-818f-17411296d08f', 'attached_at': '', 'detached_at': '', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'serial': '3d5ebda2-4a33-4090-b35f-fbdd44888d65'} {{(pid=68964) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}}
[ 1169.546713] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-4dbcf05a-27e4-4ea6-9d50-3312d620095f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1169.551923] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1169.551923] env[68964]: value = "task-3431654"
[ 1169.551923] env[68964]: _type = "Task"
[ 1169.551923] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1169.560678] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431654, 'name': Rename_Task} progress is 5%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1170.061990] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431654, 'name': Rename_Task, 'duration_secs': 0.123828} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1170.063019] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Powering on the VM {{(pid=68964) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}}
[ 1170.063019] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-d6df329e-a615-4ba4-a2cd-e4491339e3bd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1170.068740] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1170.068740] env[68964]: value = "task-3431655"
[ 1170.068740] env[68964]: _type = "Task"
[ 1170.068740] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1170.075873] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431655, 'name': PowerOnVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1170.578503] env[68964]: DEBUG oslo_vmware.api [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431655, 'name': PowerOnVM_Task, 'duration_secs': 0.43804} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1170.578773] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Powered on the VM {{(pid=68964) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}}
[ 1170.578968] env[68964]: INFO nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Took 9.69 seconds to spawn the instance on the hypervisor.
[ 1170.579231] env[68964]: DEBUG nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Checking state {{(pid=68964) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}}
[ 1170.579992] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca14b478-d17e-449b-83d0-f20ccad1062a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1170.633017] env[68964]: INFO nova.compute.manager [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Took 10.70 seconds to build instance.
[ 1170.645783] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0507ac59-04b6-4014-a5eb-f6d469236cc2 tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "07d247db-b7ca-4b5f-818f-17411296d08f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 143.923s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1170.654946] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1170.717945] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1170.718236] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1170.719834] env[68964]: INFO nova.compute.claims [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1171.011382] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d31ef20-32d4-4ec1-85f5-16aa07161494 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1171.019434] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f101d71-bdf0-4c02-8b60-d39c5984dd78 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1171.051049] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-068a09e2-194d-4919-ae5d-174280f1dbdf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1171.058254] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba1950e-a155-4c6e-8b8f-217bbc7d7b14 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1171.072210] env[68964]: DEBUG nova.compute.provider_tree [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1171.081038] env[68964]: DEBUG nova.scheduler.client.report [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1171.095026] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1171.095223] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1171.128310] env[68964]: DEBUG nova.compute.utils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1171.129745] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1171.129917] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1171.139632] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1171.187527] env[68964]: DEBUG nova.policy [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '07bf4d8c5a194e058ddacc6edc4aa5c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f1d18f1155a04e1a8f5530868df8440a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1171.217890] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1171.242824] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1171.243276] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1171.243462] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1171.243697] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1171.243799] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1171.243949] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1171.244188] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1171.244350] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1171.244518] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1171.244680] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1171.244850] env[68964]: DEBUG nova.virt.hardware [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1171.245833] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-745f039b-d7dc-455c-a45f-673f15a277bd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1171.254497] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-331b39f9-0cc8-41a4-ab10-8c202c0b23eb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1171.609318] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Successfully created port: 54ade6e1-3c37-4f4f-9113-4ded65edcfec {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1172.492163] env[68964]: DEBUG nova.compute.manager [req-c4e7c08e-f85f-4a70-b8aa-08e3fcb76654 req-ea007c0c-b785-4f4c-bd93-a47327cdd934 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Received event network-vif-plugged-54ade6e1-3c37-4f4f-9113-4ded65edcfec {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1172.492163] env[68964]: DEBUG oslo_concurrency.lockutils [req-c4e7c08e-f85f-4a70-b8aa-08e3fcb76654 req-ea007c0c-b785-4f4c-bd93-a47327cdd934 service nova] Acquiring lock "96c1b70b-9a17-46b1-999d-558b85c77d22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1172.492163] env[68964]: DEBUG oslo_concurrency.lockutils [req-c4e7c08e-f85f-4a70-b8aa-08e3fcb76654 req-ea007c0c-b785-4f4c-bd93-a47327cdd934 service nova] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1172.492163] env[68964]: DEBUG oslo_concurrency.lockutils [req-c4e7c08e-f85f-4a70-b8aa-08e3fcb76654 req-ea007c0c-b785-4f4c-bd93-a47327cdd934 service nova] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1172.492163] env[68964]: DEBUG nova.compute.manager [req-c4e7c08e-f85f-4a70-b8aa-08e3fcb76654 req-ea007c0c-b785-4f4c-bd93-a47327cdd934 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] No waiting events found dispatching network-vif-plugged-54ade6e1-3c37-4f4f-9113-4ded65edcfec {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1172.492163] env[68964]: WARNING nova.compute.manager [req-c4e7c08e-f85f-4a70-b8aa-08e3fcb76654 req-ea007c0c-b785-4f4c-bd93-a47327cdd934 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Received unexpected event network-vif-plugged-54ade6e1-3c37-4f4f-9113-4ded65edcfec for instance with vm_state building and task_state spawning.
[ 1172.681096] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Successfully updated port: 54ade6e1-3c37-4f4f-9113-4ded65edcfec {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1172.697249] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "refresh_cache-96c1b70b-9a17-46b1-999d-558b85c77d22" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1172.697530] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquired lock "refresh_cache-96c1b70b-9a17-46b1-999d-558b85c77d22" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1172.697530] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1172.765465] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1172.781194] env[68964]: DEBUG nova.compute.manager [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Received event network-changed-256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1172.781194] env[68964]: DEBUG nova.compute.manager [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Refreshing instance network info cache due to event network-changed-256e20da-4af5-4b2c-ad18-496d09fc80a4. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1172.781194] env[68964]: DEBUG oslo_concurrency.lockutils [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] Acquiring lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1172.781194] env[68964]: DEBUG oslo_concurrency.lockutils [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] Acquired lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1172.781194] env[68964]: DEBUG nova.network.neutron [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Refreshing network info cache for port 256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1173.106485] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Updating instance_info_cache with network_info: [{"id": "54ade6e1-3c37-4f4f-9113-4ded65edcfec", "address": "fa:16:3e:68:e4:cc", "network": {"id": "ea32d04b-a10b-44a5-b9f6-c9443190bdb3", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1889065912-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1d18f1155a04e1a8f5530868df8440a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1768af3d-3317-4ef5-b484-0c2707d63de7", "external-id": "nsx-vlan-transportzone-706", "segmentation_id": 706, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54ade6e1-3c", "ovs_interfaceid": "54ade6e1-3c37-4f4f-9113-4ded65edcfec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1173.118499] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Releasing lock "refresh_cache-96c1b70b-9a17-46b1-999d-558b85c77d22" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1173.118499] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance network_info: |[{"id": "54ade6e1-3c37-4f4f-9113-4ded65edcfec", "address": "fa:16:3e:68:e4:cc", "network": {"id": "ea32d04b-a10b-44a5-b9f6-c9443190bdb3", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1889065912-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1d18f1155a04e1a8f5530868df8440a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1768af3d-3317-4ef5-b484-0c2707d63de7", "external-id": "nsx-vlan-transportzone-706", "segmentation_id": 706, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54ade6e1-3c", "ovs_interfaceid": "54ade6e1-3c37-4f4f-9113-4ded65edcfec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1173.119361] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:68:e4:cc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1768af3d-3317-4ef5-b484-0c2707d63de7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '54ade6e1-3c37-4f4f-9113-4ded65edcfec', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1173.127161] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Creating folder: Project (f1d18f1155a04e1a8f5530868df8440a). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1173.127873] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-94d6c128-4a3b-4cf5-997d-f8e5523de508 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1173.142756] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Created folder: Project (f1d18f1155a04e1a8f5530868df8440a) in parent group-v684465.
[ 1173.143413] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Creating folder: Instances. Parent ref: group-v684554. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1173.143598] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-08d8bcb7-5860-4ad0-8805-a6acd261cc6a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1173.156803] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Created folder: Instances in parent group-v684554.
[ 1173.156803] env[68964]: DEBUG oslo.service.loopingcall [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1173.156803] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1173.156803] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-abce5f08-39bc-4f18-81aa-eb373305d966 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1173.188333] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1173.188333] env[68964]: value = "task-3431658"
[ 1173.188333] env[68964]: _type = "Task"
[ 1173.188333] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1173.196672] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431658, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1173.295938] env[68964]: DEBUG nova.network.neutron [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updated VIF entry in instance network info cache for port 256e20da-4af5-4b2c-ad18-496d09fc80a4. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1173.296322] env[68964]: DEBUG nova.network.neutron [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updating instance_info_cache with network_info: [{"id": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "address": "fa:16:3e:54:cf:5d", "network": {"id": "5f97c7c0-ce44-4d39-a916-1eebb6485c7e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1731903982-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ac126dc7d056478bb4048368c7722048", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98d96b75-ac36-499a-adc2-130c8c1d55ca", "external-id": "nsx-vlan-transportzone-564", "segmentation_id": 564, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap256e20da-4a", "ovs_interfaceid": "256e20da-4af5-4b2c-ad18-496d09fc80a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1173.311433] env[68964]: DEBUG oslo_concurrency.lockutils [req-b9a944eb-2814-4db5-933a-7ff8b8cf42c9 req-1407c669-2117-4599-85da-ca7056f77bf7 service nova] Releasing lock "refresh_cache-07d247db-b7ca-4b5f-818f-17411296d08f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1173.699249] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431658, 'name': CreateVM_Task, 'duration_secs': 0.291069} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1173.699508] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1173.700100] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1173.700266] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1173.700567] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1173.700813] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d11bf6f6-cc7b-43d7-bb1b-7675eac1a98d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1173.705216] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Waiting for the task: (returnval){
[ 1173.705216] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52aed19a-2536-adc6-ef4b-cb32691d19e4"
[ 1173.705216] env[68964]: _type = "Task"
[ 1173.705216] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1173.712553] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52aed19a-2536-adc6-ef4b-cb32691d19e4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1174.216393] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1174.216647] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1174.216857] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1174.545384] env[68964]: DEBUG nova.compute.manager [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Received event network-changed-54ade6e1-3c37-4f4f-9113-4ded65edcfec {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1174.545589] env[68964]: DEBUG nova.compute.manager [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Refreshing instance network info cache due to event network-changed-54ade6e1-3c37-4f4f-9113-4ded65edcfec. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1174.545796] env[68964]: DEBUG oslo_concurrency.lockutils [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] Acquiring lock "refresh_cache-96c1b70b-9a17-46b1-999d-558b85c77d22" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1174.545936] env[68964]: DEBUG oslo_concurrency.lockutils [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] Acquired lock "refresh_cache-96c1b70b-9a17-46b1-999d-558b85c77d22" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1174.546108] env[68964]: DEBUG nova.network.neutron [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Refreshing network info cache for port 54ade6e1-3c37-4f4f-9113-4ded65edcfec {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1174.904733] env[68964]: DEBUG nova.network.neutron [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Updated VIF entry in instance network info cache for port 54ade6e1-3c37-4f4f-9113-4ded65edcfec. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1174.905193] env[68964]: DEBUG nova.network.neutron [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Updating instance_info_cache with network_info: [{"id": "54ade6e1-3c37-4f4f-9113-4ded65edcfec", "address": "fa:16:3e:68:e4:cc", "network": {"id": "ea32d04b-a10b-44a5-b9f6-c9443190bdb3", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1889065912-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1d18f1155a04e1a8f5530868df8440a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1768af3d-3317-4ef5-b484-0c2707d63de7", "external-id": "nsx-vlan-transportzone-706", "segmentation_id": 706, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54ade6e1-3c", "ovs_interfaceid": "54ade6e1-3c37-4f4f-9113-4ded65edcfec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1174.915183] env[68964]: DEBUG oslo_concurrency.lockutils [req-14c6daa7-7fd0-4547-8b3d-d2cf00f3bb01 req-d11aa29f-33a9-4bca-bb2c-506ca75bba14 service nova] Releasing lock "refresh_cache-96c1b70b-9a17-46b1-999d-558b85c77d22" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1188.906984] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquiring lock "07d247db-b7ca-4b5f-818f-17411296d08f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1188.907316] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "07d247db-b7ca-4b5f-818f-17411296d08f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1188.907579] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquiring lock "07d247db-b7ca-4b5f-818f-17411296d08f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1188.907579] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "07d247db-b7ca-4b5f-818f-17411296d08f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1188.908474] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "07d247db-b7ca-4b5f-818f-17411296d08f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1188.911381] env[68964]: INFO nova.compute.manager [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Terminating instance
[ 1188.913281] env[68964]: DEBUG nova.compute.manager [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1188.913486] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Powering off the VM {{(pid=68964) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}}
[ 1188.913910] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-041eb61f-4e22-49df-9244-da8125d7c902 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1188.920740] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1188.920740] env[68964]: value = "task-3431659"
[ 1188.920740] env[68964]: _type = "Task"
[ 1188.920740] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1188.929471] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431659, 'name': PowerOffVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1189.430535] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431659, 'name': PowerOffVM_Task, 'duration_secs': 0.230841} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1189.430822] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Powered off the VM {{(pid=68964) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}}
[ 1189.431028] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Volume detach. Driver type: vmdk {{(pid=68964) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}}
[ 1189.431213] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-684536', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'name': 'volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '07d247db-b7ca-4b5f-818f-17411296d08f', 'attached_at': '', 'detached_at': '', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'serial': '3d5ebda2-4a33-4090-b35f-fbdd44888d65'} {{(pid=68964) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}}
[ 1189.431959] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a559b4c-a2ea-4768-a17a-3a8cbfcd786e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1189.449701] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0812aa68-aeff-49d0-b0e6-ec5e9076e935 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1189.455855] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d0f1aa1-95cd-44fe-9060-ec9aab72eed9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1189.472637] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a66067cc-6c20-47da-9360-f70f161aa3bb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1189.486286] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] The volume has not been displaced from its original location: [datastore1] volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65/volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65.vmdk. No consolidation needed. {{(pid=68964) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}}
[ 1189.491378] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Reconfiguring VM instance instance-00000038 to detach disk 2000 {{(pid=68964) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}}
[ 1189.491624] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-91a5494a-a9df-4eaf-be95-825738964329 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1189.508666] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1189.508666] env[68964]: value = "task-3431660"
[ 1189.508666] env[68964]: _type = "Task"
[ 1189.508666] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1189.515878] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431660, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1190.018714] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431660, 'name': ReconfigVM_Task, 'duration_secs': 0.149766} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1190.019262] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Reconfigured VM instance instance-00000038 to detach disk 2000 {{(pid=68964) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}}
[ 1190.023880] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-203bb21e-5707-48ed-b354-0ca8e5b2d55c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.038533] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1190.038533] env[68964]: value = "task-3431661"
[ 1190.038533] env[68964]: _type = "Task"
[ 1190.038533] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1190.047831] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431661, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1190.549057] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431661, 'name': ReconfigVM_Task, 'duration_secs': 0.138099} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1190.549349] env[68964]: DEBUG nova.virt.vmwareapi.volumeops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-684536', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'name': 'volume-3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '07d247db-b7ca-4b5f-818f-17411296d08f', 'attached_at': '', 'detached_at': '', 'volume_id': '3d5ebda2-4a33-4090-b35f-fbdd44888d65', 'serial': '3d5ebda2-4a33-4090-b35f-fbdd44888d65'} {{(pid=68964) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}}
[ 1190.549635] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1190.550509] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9ebc92e-9a84-4791-9dc0-ce4159357ead {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.556933] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1190.557131] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-658fcc20-9625-47de-a308-42d81afe5b5b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.613313] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1190.613524] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1190.613701] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Deleting the datastore file [datastore1] 07d247db-b7ca-4b5f-818f-17411296d08f {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1190.613992] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b6f62563-1042-4c8b-a4ec-b518bbd93a3c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.620292] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for the task: (returnval){
[ 1190.620292] env[68964]: value = "task-3431663"
[ 1190.620292] env[68964]: _type = "Task"
[ 1190.620292] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1190.629398] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431663, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1191.130307] env[68964]: DEBUG oslo_vmware.api [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Task: {'id': task-3431663, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081387} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1191.130608] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1191.130732] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1191.130931] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1191.131117] env[68964]: INFO nova.compute.manager [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Took 2.22 seconds to destroy the instance on the hypervisor.
[ 1191.131351] env[68964]: DEBUG oslo.service.loopingcall [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1191.131538] env[68964]: DEBUG nova.compute.manager [-] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1191.131632] env[68964]: DEBUG nova.network.neutron [-] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1192.060904] env[68964]: DEBUG nova.network.neutron [-] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1192.075943] env[68964]: INFO nova.compute.manager [-] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Took 0.94 seconds to deallocate network for instance.
[ 1192.153226] env[68964]: INFO nova.compute.manager [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Took 0.08 seconds to detach 1 volumes for instance.
[ 1192.157819] env[68964]: DEBUG nova.compute.manager [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Deleting volume: 3d5ebda2-4a33-4090-b35f-fbdd44888d65 {{(pid=68964) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3221}}
[ 1192.195266] env[68964]: DEBUG nova.compute.manager [req-048c1b23-cf71-4194-88db-b50b0d264327 req-ac85dc51-040e-4b8e-9e44-a9bf1375a428 service nova] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Received event network-vif-deleted-256e20da-4af5-4b2c-ad18-496d09fc80a4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1192.252650] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1192.252934] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1192.254796] env[68964]: DEBUG nova.objects.instance [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lazy-loading 'resources' on Instance uuid 07d247db-b7ca-4b5f-818f-17411296d08f {{(pid=68964) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}}
[ 1192.571352] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70b617cb-52f5-4684-8190-3edac8a37167 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1192.582099] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52dc4714-7e25-48d3-9da5-a0f817b2011b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1192.612083] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c47ad3dc-bd13-480e-a1e2-83a85e69407a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1192.619889] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-431318cc-0e15-4c97-a4e6-a1090be4799d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1192.636989] env[68964]: DEBUG nova.compute.provider_tree [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1192.650603] env[68964]: DEBUG nova.scheduler.client.report [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1192.665051] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.411s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1192.696287] env[68964]: INFO nova.scheduler.client.report [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Deleted allocations for instance 07d247db-b7ca-4b5f-818f-17411296d08f
[ 1192.749015] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3fbc72d6-ac71-4453-b76c-009a5daf5ecf tempest-ServersTestBootFromVolume-806300798 tempest-ServersTestBootFromVolume-806300798-project-member] Lock "07d247db-b7ca-4b5f-818f-17411296d08f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.841s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1193.067285] env[68964]: WARNING oslo_vmware.rw_handles [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1193.067285] env[68964]: ERROR
oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1193.067285] env[68964]: ERROR oslo_vmware.rw_handles [ 1193.067682] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1193.070393] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1193.070661] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Copying Virtual Disk [datastore2] vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/3b65a337-b5cd-4422-9e27-5f2f1b02672e/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1193.071012] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1ede1841-4f9e-443a-ab2a-1e16a133cef3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.078986] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){ [ 1193.078986] env[68964]: value = "task-3431665" [ 1193.078986] env[68964]: _type = "Task" [ 1193.078986] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1193.087285] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': task-3431665, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1193.589907] env[68964]: DEBUG oslo_vmware.exceptions [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1193.590266] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1193.590779] env[68964]: ERROR nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1193.590779] env[68964]: Faults: ['InvalidArgument'] [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Traceback (most recent call last): [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] yield resources [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self.driver.spawn(context, instance, image_meta, [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self._fetch_image_if_missing(context, vi) [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] image_cache(vi, tmp_image_ds_loc) [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] vm_util.copy_virtual_disk( [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] session._wait_for_task(vmdk_copy_task) [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] return self.wait_for_task(task_ref) [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] return evt.wait() [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] result = hub.switch() [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] return self.greenlet.switch() [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self.f(*self.args, **self.kw) [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] raise exceptions.translate_fault(task_info.error) [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Faults: ['InvalidArgument'] [ 1193.590779] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] [ 1193.591804] env[68964]: INFO nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Terminating instance [ 1193.592603] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1193.592811] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 
tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1193.593646] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1193.593737] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1193.593950] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3e690367-045a-4d09-8838-51321d8444e6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.596307] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f90db67-1670-419d-8cf1-fa52bb23d31b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.603628] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1193.603857] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f3510597-0296-4e9e-957a-1dc425bc8556 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.606085] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1193.606265] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1193.607197] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f206c23-85c5-49ae-98fb-5edc76746b9a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.611957] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Waiting for the task: (returnval){ [ 1193.611957] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a62d4f-6f70-6fca-54fe-1196a2885bbd" [ 1193.611957] env[68964]: _type = "Task" [ 1193.611957] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1193.625112] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a62d4f-6f70-6fca-54fe-1196a2885bbd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1193.671415] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1193.671631] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1193.671807] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Deleting the datastore file [datastore2] ff09bdbb-84e3-4182-8118-e99512a0e9de {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1193.672083] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-90589e26-62cd-49af-82af-aa5b11971b50 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1193.677611] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for the task: (returnval){ [ 1193.677611] env[68964]: value = "task-3431667" [ 1193.677611] env[68964]: _type = "Task" [ 1193.677611] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1193.685269] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': task-3431667, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1194.122081] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1194.122347] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Creating directory with path [datastore2] vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1194.122584] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a1399c9e-9f17-4826-b42f-dc31db48b007 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.134023] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Created directory with path [datastore2] vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1194.134227] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Fetch image to [datastore2] vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1194.134395] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1194.135150] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4bb6bef-42b9-4a9a-97ee-1d68d6251591 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.142073] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad215e90-b79c-423d-a454-fcdb7e2301ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.151164] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d2020ac-9124-4d91-85a9-b8e32dfe1f57 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.192875] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc5de27a-813b-415f-b7cb-9f937b27a9c2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.200224] env[68964]: DEBUG oslo_vmware.api [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Task: {'id': task-3431667, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064268} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1194.201738] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1194.201936] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1194.202120] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1194.202295] env[68964]: INFO nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Took 0.61 seconds to destroy the instance on the hypervisor. 
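The delete sequence above is oslo.vmware's standard invoke-then-poll pattern: the *_Task call (here FileManager.DeleteDatastoreFile_Task, task-3431667) returns a Task moref immediately, and wait_for_task() polls it until vCenter reports success, translating any fault into a Python exception; that is what produces the "progress is 0%" followed by "completed successfully" lines. A minimal sketch of the pattern against the public oslo.vmware API follows; the endpoint, credentials, and datacenter moref are placeholders rather than values from this log, and it needs a live vCenter to actually run:

from oslo_vmware import api, vim_util

# Placeholder endpoint and credentials.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

file_manager = session.vim.service_content.fileManager
dc_ref = vim_util.get_moref('datacenter-1', 'Datacenter')  # placeholder moref

# The *_Task call returns a Task reference immediately...
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] ff09bdbb-84e3-4182-8118-e99512a0e9de',
    datacenter=dc_ref)

# ...and wait_for_task() polls the task until 'success', raising a
# translated exception (e.g. VimFaultException) if the task errors out.
session.wait_for_task(task)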
[ 1194.208061] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-eacf5f06-cd04-417b-a0be-bd80245be4a6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.208871] env[68964]: DEBUG nova.compute.claims [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1194.209337] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1194.209763] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1194.228031] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1194.293745] env[68964]: DEBUG oslo_vmware.rw_handles [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1194.358102] env[68964]: DEBUG oslo_vmware.rw_handles [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1194.358378] env[68964]: DEBUG oslo_vmware.rw_handles [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1194.578626] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bca41132-75d2-4ea0-a65e-34ae57cf9d5d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.587137] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3c7eab9-e8f6-4173-9637-a1396994149b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.618980] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a92af310-a62b-466f-bd34-13f301f18851 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.626978] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9aaf5c3-3995-4ac7-af6c-c538c9af4de0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1194.641552] env[68964]: DEBUG nova.compute.provider_tree [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1194.652093] env[68964]: DEBUG nova.scheduler.client.report [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1194.667411] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.458s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1194.667932] env[68964]: ERROR nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1194.667932] env[68964]: Faults: ['InvalidArgument'] [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Traceback (most recent call last): [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1194.667932] env[68964]: ERROR 
nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self.driver.spawn(context, instance, image_meta, [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self._fetch_image_if_missing(context, vi) [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] image_cache(vi, tmp_image_ds_loc) [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] vm_util.copy_virtual_disk( [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] session._wait_for_task(vmdk_copy_task) [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] return self.wait_for_task(task_ref) [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] return evt.wait() [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] result = hub.switch() [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] return self.greenlet.switch() [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] self.f(*self.args, **self.kw) [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] raise exceptions.translate_fault(task_info.error) [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Faults: ['InvalidArgument'] [ 1194.667932] env[68964]: ERROR nova.compute.manager [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] [ 1194.670071] env[68964]: DEBUG nova.compute.utils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1194.670071] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Build of instance ff09bdbb-84e3-4182-8118-e99512a0e9de was re-scheduled: A specified parameter was not correct: fileType [ 1194.670071] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1194.670381] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1194.670545] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1194.670706] env[68964]: DEBUG nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1194.670861] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1195.024714] env[68964]: DEBUG nova.network.neutron [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1195.042060] env[68964]: INFO nova.compute.manager [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Took 0.37 seconds to deallocate network for instance. [ 1195.146426] env[68964]: INFO nova.scheduler.client.report [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Deleted allocations for instance ff09bdbb-84e3-4182-8118-e99512a0e9de [ 1195.168507] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7e296380-9d5a-446d-9b70-d11bd34e397e tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 513.016s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.172143] env[68964]: DEBUG oslo_concurrency.lockutils [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 180.541s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.172377] env[68964]: DEBUG oslo_concurrency.lockutils [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Acquiring lock "ff09bdbb-84e3-4182-8118-e99512a0e9de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.172709] env[68964]: DEBUG oslo_concurrency.lockutils [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.173087] env[68964]: DEBUG oslo_concurrency.lockutils [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.175355] env[68964]: INFO nova.compute.manager [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Terminating instance [ 1195.177083] env[68964]: DEBUG nova.compute.manager [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1195.177271] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1195.178251] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3ca5fb10-d0b8-40ba-a850-ecb867847bd2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.185677] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1195.191396] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88d6b04c-e799-4433-a061-47864e2e5f39 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.224588] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ff09bdbb-84e3-4182-8118-e99512a0e9de could not be found. [ 1195.224588] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1195.224588] env[68964]: INFO nova.compute.manager [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Took 0.05 seconds to destroy the instance on the hypervisor. 
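The "Acquiring lock ... acquired ... released" triplets throughout this log (e.g. the instance lock above, held 513.016s across the failed build and then re-acquired for terminate) come from oslo.concurrency's lockutils wrapper, which Nova uses to serialize per-instance and resource-tracker critical sections. A minimal sketch of the pattern; the lock name mirrors the log, and the function body is illustrative only:

from oslo_concurrency import lockutils

# lockutils.synchronized emits the DEBUG "Acquiring/acquired/released"
# lines seen above, including how long each lock was waited on and held.
@lockutils.synchronized('ff09bdbb-84e3-4182-8118-e99512a0e9de')
def do_terminate_instance():
    pass  # hypothetical critical section: guest teardown, network cleanup

do_terminate_instance()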
[ 1195.224788] env[68964]: DEBUG oslo.service.loopingcall [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1195.225063] env[68964]: DEBUG nova.compute.manager [-] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1195.225167] env[68964]: DEBUG nova.network.neutron [-] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1195.256016] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.256256] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.257843] env[68964]: INFO nova.compute.claims [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1195.262368] env[68964]: DEBUG nova.network.neutron [-] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1195.271198] env[68964]: INFO nova.compute.manager [-] [instance: ff09bdbb-84e3-4182-8118-e99512a0e9de] Took 0.05 seconds to deallocate network for instance. 
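The "Waiting for function ..._deallocate_network_with_retries to return" lines show Nova driving network deallocation through oslo.service's looping-call machinery: the wrapped function is re-invoked on a fixed interval until it signals completion by raising LoopingCallDone. A rough, self-contained sketch of that mechanism using the public oslo.service API (the retry body is illustrative, not Nova's actual code):

from oslo_service import loopingcall

attempts = {'n': 0}

def _deallocate_with_retries():
    attempts['n'] += 1
    if attempts['n'] < 3:   # pretend the first two attempts fail
        return              # returning lets the loop retry after `interval`
    raise loopingcall.LoopingCallDone()  # success: stop the loop

timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
timer.start(interval=0.1).wait()  # blocks until LoopingCallDone is raised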
[ 1195.382091] env[68964]: DEBUG oslo_concurrency.lockutils [None req-33f87e01-25ff-46da-a930-2a46fc5c53e7 tempest-ServersAdminTestJSON-2005871541 tempest-ServersAdminTestJSON-2005871541-project-member] Lock "ff09bdbb-84e3-4182-8118-e99512a0e9de" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.210s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.597082] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f0e44f7-9ee5-4f99-a5e5-8f1d159453ac {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.603416] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d6acc3f-337d-4815-9c27-db2e4046a3ff {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.636658] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd1b8a67-345b-4129-930b-cb1bc223ae56 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.645734] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71269cb8-8b9d-48c0-a072-6d195eee550f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.662403] env[68964]: DEBUG nova.compute.provider_tree [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1195.674210] env[68964]: DEBUG nova.scheduler.client.report [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1195.694142] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.436s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.694142] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1195.727370] env[68964]: DEBUG nova.compute.utils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1195.727370] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1195.727370] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1195.737138] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1195.812279] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1195.839355] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1195.839814] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1195.840109] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1195.840425] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1195.840694] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1195.840964] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1195.841662] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1195.841970] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1195.842284] env[68964]: DEBUG nova.virt.hardware [None 
req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1195.842647] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1195.842987] env[68964]: DEBUG nova.virt.hardware [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1195.843966] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c81ec0f-e2ca-46c6-8857-be4bb4872840 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.852704] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11a2da45-d0d0-48cc-b263-78598e4a29f1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.858452] env[68964]: DEBUG nova.policy [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21b5b62c1d9a4afc8e26b122ce6de51c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '07b4913b8fef4ee3a0d920bc36fefd18', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1196.545858] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Successfully created port: 6bd95ed4-c7eb-40a2-858b-212ccde0ba65 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1197.656124] env[68964]: DEBUG nova.compute.manager [req-cca1d8bd-f07c-4419-958c-26926e3f2874 req-7e077937-98d5-4d50-ad13-b0bad3f7196c service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Received event network-vif-plugged-6bd95ed4-c7eb-40a2-858b-212ccde0ba65 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1197.656124] env[68964]: DEBUG oslo_concurrency.lockutils [req-cca1d8bd-f07c-4419-958c-26926e3f2874 req-7e077937-98d5-4d50-ad13-b0bad3f7196c service nova] Acquiring lock "a317d842-0282-4ace-a457-d8031cf0adca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1197.656124] env[68964]: DEBUG oslo_concurrency.lockutils [req-cca1d8bd-f07c-4419-958c-26926e3f2874 req-7e077937-98d5-4d50-ad13-b0bad3f7196c service nova] Lock
"a317d842-0282-4ace-a457-d8031cf0adca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1197.656124] env[68964]: DEBUG oslo_concurrency.lockutils [req-cca1d8bd-f07c-4419-958c-26926e3f2874 req-7e077937-98d5-4d50-ad13-b0bad3f7196c service nova] Lock "a317d842-0282-4ace-a457-d8031cf0adca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1197.656124] env[68964]: DEBUG nova.compute.manager [req-cca1d8bd-f07c-4419-958c-26926e3f2874 req-7e077937-98d5-4d50-ad13-b0bad3f7196c service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] No waiting events found dispatching network-vif-plugged-6bd95ed4-c7eb-40a2-858b-212ccde0ba65 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1197.656124] env[68964]: WARNING nova.compute.manager [req-cca1d8bd-f07c-4419-958c-26926e3f2874 req-7e077937-98d5-4d50-ad13-b0bad3f7196c service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Received unexpected event network-vif-plugged-6bd95ed4-c7eb-40a2-858b-212ccde0ba65 for instance with vm_state building and task_state spawning. [ 1197.748730] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Successfully updated port: 6bd95ed4-c7eb-40a2-858b-212ccde0ba65 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1197.765304] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "refresh_cache-a317d842-0282-4ace-a457-d8031cf0adca" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1197.765534] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "refresh_cache-a317d842-0282-4ace-a457-d8031cf0adca" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1197.765652] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1198.078121] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1198.459804] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Updating instance_info_cache with network_info: [{"id": "6bd95ed4-c7eb-40a2-858b-212ccde0ba65", "address": "fa:16:3e:78:27:a8", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bd95ed4-c7", "ovs_interfaceid": "6bd95ed4-c7eb-40a2-858b-212ccde0ba65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1198.478105] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "refresh_cache-a317d842-0282-4ace-a457-d8031cf0adca" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1198.481931] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance network_info: |[{"id": "6bd95ed4-c7eb-40a2-858b-212ccde0ba65", "address": "fa:16:3e:78:27:a8", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bd95ed4-c7", "ovs_interfaceid": "6bd95ed4-c7eb-40a2-858b-212ccde0ba65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1198.481931] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:27:a8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92fe29b3-0907-453d-aabb-5559c4bd7c0f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6bd95ed4-c7eb-40a2-858b-212ccde0ba65', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1198.490929] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating folder: Project (07b4913b8fef4ee3a0d920bc36fefd18). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1198.492269] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0c9acdc8-8ab8-41db-a174-959c3ea7358f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.505018] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created folder: Project (07b4913b8fef4ee3a0d920bc36fefd18) in parent group-v684465. [ 1198.505018] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating folder: Instances. Parent ref: group-v684557. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1198.505018] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f0b1eae8-9382-4fe5-bdb2-571355e63c70 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.512278] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created folder: Instances in parent group-v684557. [ 1198.512984] env[68964]: DEBUG oslo.service.loopingcall [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1198.513310] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1198.513605] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-16ff7ebc-9ee0-4b40-89e9-9e44430494be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1198.536413] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1198.536413] env[68964]: value = "task-3431670" [ 1198.536413] env[68964]: _type = "Task" [ 1198.536413] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1198.545282] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431670, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1199.049215] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431670, 'name': CreateVM_Task, 'duration_secs': 0.321229} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1199.049496] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1199.050082] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1199.050240] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1199.050556] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1199.050858] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-12b2131e-d98f-4239-adc8-9d2d4c1d85e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.057799] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1199.057799] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]520e676b-175f-71a9-9afe-dc0ecd0726d8" [ 1199.057799] env[68964]: _type = "Task" [ 1199.057799] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1199.066717] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]520e676b-175f-71a9-9afe-dc0ecd0726d8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1199.570069] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1199.570375] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1199.570591] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1200.110869] env[68964]: DEBUG nova.compute.manager [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Received event network-changed-6bd95ed4-c7eb-40a2-858b-212ccde0ba65 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1200.110869] env[68964]: DEBUG nova.compute.manager [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Refreshing instance network info cache due to event network-changed-6bd95ed4-c7eb-40a2-858b-212ccde0ba65. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1200.111154] env[68964]: DEBUG oslo_concurrency.lockutils [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] Acquiring lock "refresh_cache-a317d842-0282-4ace-a457-d8031cf0adca" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1200.112646] env[68964]: DEBUG oslo_concurrency.lockutils [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] Acquired lock "refresh_cache-a317d842-0282-4ace-a457-d8031cf0adca" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1200.112967] env[68964]: DEBUG nova.network.neutron [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Refreshing network info cache for port 6bd95ed4-c7eb-40a2-858b-212ccde0ba65 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1200.712342] env[68964]: DEBUG nova.network.neutron [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Updated VIF entry in instance network info cache for port 6bd95ed4-c7eb-40a2-858b-212ccde0ba65. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1200.712934] env[68964]: DEBUG nova.network.neutron [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Updating instance_info_cache with network_info: [{"id": "6bd95ed4-c7eb-40a2-858b-212ccde0ba65", "address": "fa:16:3e:78:27:a8", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bd95ed4-c7", "ovs_interfaceid": "6bd95ed4-c7eb-40a2-858b-212ccde0ba65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1200.735796] env[68964]: DEBUG oslo_concurrency.lockutils [req-864006bb-d17d-45fe-b62e-94b99b445cb3 req-4ae3cc88-9995-438b-8e8b-93ecee656147 service nova] Releasing lock "refresh_cache-a317d842-0282-4ace-a457-d8031cf0adca" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1202.802643] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Acquiring lock "c7e9acc0-1427-4382-bcf8-99fdcc08aac0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.803302] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Lock "c7e9acc0-1427-4382-bcf8-99fdcc08aac0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.834780] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Acquiring lock "e437a43d-00b7-4feb-ae97-215238cf845b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.835030] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 
tempest-ListServersNegativeTestJSON-1408168082-project-member] Lock "e437a43d-00b7-4feb-ae97-215238cf845b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.873033] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Acquiring lock "43a9c974-399f-44cb-b836-4bdf17a8d768" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.874389] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Lock "43a9c974-399f-44cb-b836-4bdf17a8d768" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.917917] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a0746bc-aa56-4128-9459-363325689938 tempest-ServerShowV254Test-294334972 tempest-ServerShowV254Test-294334972-project-member] Acquiring lock "530b7cf0-53d3-4bfb-b545-b0bb57dc91b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.917917] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a0746bc-aa56-4128-9459-363325689938 tempest-ServerShowV254Test-294334972 tempest-ServerShowV254Test-294334972-project-member] Lock "530b7cf0-53d3-4bfb-b545-b0bb57dc91b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.679585] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d147c4cc-1408-4300-b375-73a5bd6617e1 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Acquiring lock "db7c702e-3d49-4eb7-9d7e-c715186f1f78" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1203.679825] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d147c4cc-1408-4300-b375-73a5bd6617e1 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Lock "db7c702e-3d49-4eb7-9d7e-c715186f1f78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1203.724578] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1203.724802] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1204.097819] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b226d23b-1a38-4898-8155-8dced1791fc6 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Acquiring lock "178b3fc0-3f93-400f-ba19-d9703c62fd22" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.098143] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b226d23b-1a38-4898-8155-8dced1791fc6 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Lock "178b3fc0-3f93-400f-ba19-d9703c62fd22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1204.476049] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9811f56a-6f83-4d8f-9e41-b6071e3cb4c7 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Acquiring lock "7f64ec65-fc61-4c44-a489-a36b9f8750e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1204.476181] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9811f56a-6f83-4d8f-9e41-b6071e3cb4c7 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Lock "7f64ec65-fc61-4c44-a489-a36b9f8750e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1204.724350] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1205.719267] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1205.723928] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1205.724161] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1205.724311] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances with incomplete migration {{(pid=68964) _cleanup_incomplete_migrations 
/opt/stack/nova/nova/compute/manager.py:11236}} [ 1206.726020] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1206.746928] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1206.747125] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1206.778477] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] There are 1 instances to clean {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1206.778744] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07d247db-b7ca-4b5f-818f-17411296d08f] Instance has had 0 of 5 cleanup attempts {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11211}} [ 1206.818165] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1207.804633] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1207.804962] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1208.074570] env[68964]: WARNING oslo_vmware.rw_handles [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1208.074570] env[68964]: ERROR oslo_vmware.rw_handles [ 1208.075099] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1208.076959] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1208.077223] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Copying Virtual Disk [datastore1] vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/890570d8-e9d3-4e98-a50c-df684d749d60/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1208.077502] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d5dca491-c660-428c-ba01-bd8c5fa9a6e6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.085948] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){ [ 1208.085948] 
env[68964]: value = "task-3431671" [ 1208.085948] env[68964]: _type = "Task" [ 1208.085948] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1208.093889] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': task-3431671, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1208.596680] env[68964]: DEBUG oslo_vmware.exceptions [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1208.596992] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1208.597586] env[68964]: ERROR nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1208.597586] env[68964]: Faults: ['InvalidArgument'] [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Traceback (most recent call last): [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] yield resources [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self.driver.spawn(context, instance, image_meta, [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self._fetch_image_if_missing(context, vi) [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] image_cache(vi, tmp_image_ds_loc) [ 
1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] vm_util.copy_virtual_disk( [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] session._wait_for_task(vmdk_copy_task) [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] return self.wait_for_task(task_ref) [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] return evt.wait() [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] result = hub.switch() [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] return self.greenlet.switch() [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self.f(*self.args, **self.kw) [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] raise exceptions.translate_fault(task_info.error) [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Faults: ['InvalidArgument'] [ 1208.597586] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] [ 1208.598737] env[68964]: INFO nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Terminating instance [ 1208.599462] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 
tempest-SecurityGroupsTestJSON-791381166-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1208.599682] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1208.600303] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1208.600494] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1208.600716] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-35d2fea6-7bae-427d-8545-5bdd45cdad6d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.603074] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c74d176e-d777-4379-bdc0-2a85e4f062f8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.611017] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1208.612938] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c89b84c7-127a-432a-8828-827314aab064 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.614504] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1208.614777] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1208.615581] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dc8a74ff-cdf3-46e5-9162-6046d2b66ece {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.621455] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 1208.621455] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]521ac569-da49-c269-3262-6e41468dd43b" [ 1208.621455] env[68964]: _type = "Task" [ 1208.621455] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1208.629639] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]521ac569-da49-c269-3262-6e41468dd43b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1208.691441] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1208.691668] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1208.691890] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Deleting the datastore file [datastore1] b2d9a7ec-f565-49d3-8d0d-9339504f8a86 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1208.692228] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4352041b-9de2-4e26-b51f-110e0af210ac {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.697943] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for the task: (returnval){ [ 1208.697943] env[68964]: value = "task-3431673" [ 1208.697943] env[68964]: _type = "Task" [ 1208.697943] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1208.705829] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': task-3431673, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1208.724369] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1208.724554] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1208.724691] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1208.746863] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.747085] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.747269] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.747449] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.747623] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.747789] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.747940] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.748122] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.748289] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.748449] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1208.748599] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1208.749172] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1208.749419] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1208.760585] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.760858] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1208.761063] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1208.761253] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1208.762436] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caf65d65-33d1-4910-9c2b-dc0885e7b0d5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.771897] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b8460d-e656-4d1b-84b5-f3b49961cc5c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.787865] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
[ 1208.787865] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9c2b061-7b71-42f9-9faa-3bb4844b0dc8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1208.794850] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85694d75-3b6a-4501-b44a-370376e10c41 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1208.824405] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180882MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1208.824705] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1208.824705] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1208.960843] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961054] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961195] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961319] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961438] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961572] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961676] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961792] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.961906] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.962028] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a317d842-0282-4ace-a457-d8031cf0adca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1208.973339] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1208.983508] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1208.992540] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.001717] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.011052] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.020747] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 15faae57-ab24-417e-9bf2-1aee11ccc2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.030510] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.041031] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7e9acc0-1427-4382-bcf8-99fdcc08aac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.052034] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e437a43d-00b7-4feb-ae97-215238cf845b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.061763] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 43a9c974-399f-44cb-b836-4bdf17a8d768 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.073019] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 530b7cf0-53d3-4bfb-b545-b0bb57dc91b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.084017] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance db7c702e-3d49-4eb7-9d7e-c715186f1f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.095024] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 178b3fc0-3f93-400f-ba19-d9703c62fd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.104656] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f64ec65-fc61-4c44-a489-a36b9f8750e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1209.104815] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1209.104963] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
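The final resource view is arithmetically consistent with the ten actively managed instances listed above, each holding a {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} allocation, plus the 512 MB memory reservation in the provider inventory reported further down. A quick check:

    active = 10                      # instances listed as actively managed
    per_instance = {"MEMORY_MB": 128, "DISK_GB": 1, "VCPU": 1}
    reserved_ram = 512               # MEMORY_MB 'reserved' in the inventory records

    assert reserved_ram + active * per_instance["MEMORY_MB"] == 1792  # used_ram MB
    assert active * per_instance["DISK_GB"] == 10                     # used_disk GB
    assert active * per_instance["VCPU"] == 10                        # used_vcpus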
[ 1209.133104] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1209.133377] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating directory with path [datastore1] vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1209.134076] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ef4f29b-c271-4c27-b93b-ca0c5b56116b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.150050] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Created directory with path [datastore1] vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1209.150275] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Fetch image to [datastore1] vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1209.150445] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1209.151320] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf84397-06da-4765-b27f-c2b25948992d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.161217] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b424934-f981-4fe1-b09d-e49e4b5e205f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.171016] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7577b449-6369-4d29-ae52-e26b9b5e7539 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.212898] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7dd5e80-1d8c-493f-991a-947ac9ee5c1b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.220683] env[68964]: DEBUG oslo_vmware.api [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Task: {'id': task-3431673, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086004} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1209.223081] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1209.223081] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1209.223081] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1209.223081] env[68964]: INFO nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 1209.226730] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f2fbd234-7fb8-4c40-8214-656d77084285 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.228707] env[68964]: DEBUG nova.compute.claims [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1209.228883] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1209.248301] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1209.300012] env[68964]: DEBUG oslo_vmware.rw_handles [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1209.358724] env[68964]: DEBUG oslo_vmware.rw_handles [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1209.358909] env[68964]: DEBUG oslo_vmware.rw_handles [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
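The write handle streams image data to the ESX host's /folder endpoint; the URL layout is visible in the rw_handles entries above. A sketch that rebuilds that URL from its parts (values taken from the log; this illustrates the format only, not oslo.vmware's implementation):

    from urllib.parse import quote, urlencode

    def datastore_folder_url(host, ds_path, dc_path, ds_name, port=443):
        # https://<host>:<port>/folder/<path>?dcPath=<datacenter>&dsName=<datastore>
        query = urlencode({"dcPath": dc_path, "dsName": ds_name})
        return "https://%s:%d/folder/%s?%s" % (host, port, quote(ds_path), query)

    url = datastore_folder_url(
        "esx7c2n2.openstack.eu-de-1.cloud.sap",
        "vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/"
        "b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk",
        "ha-datacenter",
        "datastore1",
    )  # matches the URL logged by _create_write_connection above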
[ 1209.463762] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-816ee5d3-e249-447a-9b07-6d92d799b919 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.471746] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e08e25c-c38b-43ec-8007-bb361aa69ddb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.501268] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20b9ba24-ec69-406a-a84f-a06b0a7c60ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.508060] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f4ea832-b299-46f8-894b-b65381c2f8a3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.521640] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1209.529822] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1209.543772] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1209.543974] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.719s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1209.544289] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.315s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1209.869161] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8238fa-409b-45f0-a70e-cbc939b399ef {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.879974] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bf7f588-f066-44c4-94bd-ba15262a21be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.928981] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d815d2cf-566a-42b4-972a-3f623d1819d4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.939926] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67c71555-871e-409b-bf6f-81bcaa5b413c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1209.961670] env[68964]: DEBUG nova.compute.provider_tree [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1209.971102] env[68964]: DEBUG nova.scheduler.client.report [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1209.986405] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.442s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1209.986954] env[68964]: ERROR nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1209.986954] env[68964]: Faults: ['InvalidArgument']
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Traceback (most recent call last):
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self.driver.spawn(context, instance, image_meta,
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self._fetch_image_if_missing(context, vi)
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] image_cache(vi, tmp_image_ds_loc)
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] vm_util.copy_virtual_disk(
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] session._wait_for_task(vmdk_copy_task)
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] return self.wait_for_task(task_ref)
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] return evt.wait()
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] result = hub.switch()
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] return self.greenlet.switch()
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] self.f(*self.args, **self.kw)
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] raise exceptions.translate_fault(task_info.error)
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Faults: ['InvalidArgument']
[ 1209.986954] env[68964]: ERROR nova.compute.manager [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86]
[ 1209.987884] env[68964]: DEBUG nova.compute.utils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1209.989086] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Build of instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 was re-scheduled: A specified parameter was not correct: fileType
[ 1209.989086] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1209.989466] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1209.989642] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1209.989835] env[68964]: DEBUG nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1209.989956] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1210.335916] env[68964]: DEBUG nova.network.neutron [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1210.355198] env[68964]: INFO nova.compute.manager [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Took 0.36 seconds to deallocate network for instance.
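The traceback shows _poll_task translating the vCenter task error into a VimFaultException, which _build_and_run_instance then turns into a reschedule. A sketch of how a caller can inspect such a fault; session and task are placeholders, and the exception attributes follow oslo.vmware:

    from oslo_vmware import exceptions as vexc

    def wait_and_classify(session, task):
        try:
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # In the failure above, e.fault_list was ['InvalidArgument'] and
            # the message was "A specified parameter was not correct: fileType".
            if 'InvalidArgument' in e.fault_list:
                raise RuntimeError("disk copy rejected by vCenter: %s" % e) from e
            raise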
[ 1210.455484] env[68964]: INFO nova.scheduler.client.report [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Deleted allocations for instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86
[ 1210.476291] env[68964]: DEBUG oslo_concurrency.lockutils [None req-68f7eefc-3c52-4f6e-8eee-3877b5e79b36 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 408.148s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1210.477468] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 211.190s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1210.477692] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Acquiring lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1210.477899] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1210.478079] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1210.480988] env[68964]: INFO nova.compute.manager [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Terminating instance
[ 1210.483092] env[68964]: DEBUG nova.compute.manager [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1210.483496] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1210.484287] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3f0d80ed-44de-447c-815d-f38b5d4e3023 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1210.496492] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f74a8a8-3e0a-4d08-a762-07471a5fa74d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1210.517547] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1210.540645] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b2d9a7ec-f565-49d3-8d0d-9339504f8a86 could not be found.
[ 1210.540867] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1210.541110] env[68964]: INFO nova.compute.manager [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Took 0.06 seconds to destroy the instance on the hypervisor.
[ 1210.541300] env[68964]: DEBUG oslo.service.loopingcall [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
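The "Waiting for function ... to return" entry is oslo.service's loopingcall wrapper. A minimal sketch of that pattern, with a toy callback standing in for nova's network-deallocation retry wrapper:

    from oslo_service import loopingcall

    attempts = {"n": 0}

    def _retrying_step():
        attempts["n"] += 1
        if attempts["n"] < 3:
            return                      # not done yet; loop again
        raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_retrying_step)
    # start() returns an event; wait() blocks until LoopingCallDone is
    # raised, which is when the "Waiting for function ..." entry resolves.
    result = timer.start(interval=0.1).wait()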
[ 1210.541535] env[68964]: DEBUG nova.compute.manager [-] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1210.541627] env[68964]: DEBUG nova.network.neutron [-] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1210.578952] env[68964]: DEBUG nova.network.neutron [-] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1210.595378] env[68964]: INFO nova.compute.manager [-] [instance: b2d9a7ec-f565-49d3-8d0d-9339504f8a86] Took 0.05 seconds to deallocate network for instance.
[ 1210.611371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1210.611634] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1210.613075] env[68964]: INFO nova.compute.claims [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1210.697811] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ffe9c776-b1f9-4cfb-b487-5972bde85018 tempest-MigrationsAdminTest-921595949 tempest-MigrationsAdminTest-921595949-project-member] Lock "b2d9a7ec-f565-49d3-8d0d-9339504f8a86" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.220s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1210.928140] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16474127-6a2c-457b-99bf-6f4c0203e97a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1210.936249] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14592420-cfeb-4474-bbfd-f0c8d5baa707 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1210.967962] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ab589e3-75b3-4ed4-b9c3-b56e41ada0d3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1210.976133] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e40173fe-00e6-456a-bbbd-e5be2e34d584 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1210.989481] env[68964]: DEBUG nova.compute.provider_tree [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1210.997846] env[68964]: DEBUG nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1211.012077] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.400s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1211.012586] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1211.043108] env[68964]: DEBUG nova.compute.utils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1211.044496] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Not allocating networking since 'none' was specified. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1211.054724] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1211.117475] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1211.142398] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1211.142654] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1211.142812] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1211.143075] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1211.143234] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1211.143382] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1211.143590] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1211.143745] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1211.144171] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1211.144335] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1211.144514] env[68964]: DEBUG nova.virt.hardware [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1211.145371] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a1b2add-1ea7-43ac-a2da-39fb1465a70f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1211.153366] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5754640-185a-45c4-b6f6-5ffc820817a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1211.167905] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance VIF info [] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1211.173440] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Creating folder: Project (ff05eacfe6854d89935f2b52981aa8b6). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1211.173753] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fd9bed01-18cb-4cf0-8490-1aeb5af6a7d2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1211.184310] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Created folder: Project (ff05eacfe6854d89935f2b52981aa8b6) in parent group-v684465.
[ 1211.184524] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Creating folder: Instances. Parent ref: group-v684560. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
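The CPU topology negotiation logged above reduces to enumerating (sockets, cores, threads) factorizations of the vCPU count when neither flavor nor image imposes limits; for the 1-vCPU m1.nano flavor only 1:1:1 qualifies. A standalone illustration of that enumeration, mirroring the idea rather than nova's exact code:

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Yield every (sockets, cores, threads) whose product is vcpus and
        # which respects the per-dimension limits.
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            remaining = vcpus // sockets
            for cores in range(1, min(remaining, max_cores) + 1):
                if remaining % cores:
                    continue
                threads = remaining // cores
                if threads <= max_threads:
                    yield (sockets, cores, threads)

    print(list(possible_topologies(1)))   # [(1, 1, 1)], as logged above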
[ 1211.185031] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6413c555-f56d-42af-b1df-479f43f4323c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1211.193384] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Created folder: Instances in parent group-v684560.
[ 1211.193617] env[68964]: DEBUG oslo.service.loopingcall [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1211.193788] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1211.194069] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3f7e0d09-7a02-4606-ae48-f5947cedb51d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1211.211354] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1211.211354] env[68964]: value = "task-3431676"
[ 1211.211354] env[68964]: _type = "Task"
[ 1211.211354] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1211.218671] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431676, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1211.722832] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431676, 'name': CreateVM_Task, 'duration_secs': 0.260487} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1211.723041] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1211.723451] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1211.723606] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1211.723940] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1211.724198] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c6360ff-70b6-48bc-bdf7-ce890af21263 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1211.728387] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for the task: (returnval){
[ 1211.728387] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52b9579a-b9cf-6e56-74be-14c82edbe819"
[ 1211.728387] env[68964]: _type = "Task"
[ 1211.728387] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1211.735466] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52b9579a-b9cf-6e56-74be-14c82edbe819, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
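The "(returnval){ value = ... }" blocks are oslo.vmware logging the task object it is about to poll; wait_for_task then produces the "progress is N%" and "completed successfully" entries. A sketch of the invoke/poll pattern, assuming session is an established VMwareAPISession and the other arguments are placeholders:

    def create_vm(session, folder_ref, config_spec, res_pool_ref, host_ref):
        # invoke_api issues the SOAP call and returns a Task moref such as
        # "task-3431676" above.
        task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=res_pool_ref,
                                  host=host_ref)
        # wait_for_task polls TaskInfo until success or error, emitting the
        # "progress is N%" / "completed successfully" lines.
        task_info = session.wait_for_task(task)
        return task_info.result        # managed object ref of the new VM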
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1212.238375] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1212.238728] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1212.238839] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1220.587622] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1220.587933] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1242.594901] env[68964]: WARNING oslo_vmware.rw_handles [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1242.594901] env[68964]:
ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1242.594901] env[68964]: ERROR oslo_vmware.rw_handles [ 1242.595637] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1242.597585] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1242.597858] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Copying Virtual Disk [datastore2] vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/97b511ec-c073-4468-b3e6-cdb9ac1ddc68/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1242.598555] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bfcec782-4ecd-4025-a68a-8a88bb47c5ed {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1242.607558] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Waiting for the task: (returnval){ [ 1242.607558] env[68964]: value = "task-3431677" [ 1242.607558] env[68964]: _type = "Task" [ 1242.607558] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1242.615988] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Task: {'id': task-3431677, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1243.118105] env[68964]: DEBUG oslo_vmware.exceptions [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1243.118474] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1243.119075] env[68964]: ERROR nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1243.119075] env[68964]: Faults: ['InvalidArgument'] [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Traceback (most recent call last): [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] yield resources [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self.driver.spawn(context, instance, image_meta, [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self._fetch_image_if_missing(context, vi) [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] image_cache(vi, tmp_image_ds_loc) [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] vm_util.copy_virtual_disk( [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] session._wait_for_task(vmdk_copy_task) [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] return self.wait_for_task(task_ref) [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] return evt.wait() [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] result = hub.switch() [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] return self.greenlet.switch() [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self.f(*self.args, **self.kw) [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] raise exceptions.translate_fault(task_info.error) [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Faults: ['InvalidArgument'] [ 1243.119075] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] [ 1243.120158] env[68964]: INFO nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Terminating instance [ 1243.121084] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1243.121331] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1243.121580] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-489009ed-e22b-43a5-b61b-9b33798a7aaf 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.124198] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1243.124387] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1243.125139] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-746af80d-355a-4de6-8319-8c5716152750 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.132069] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1243.132285] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9f2fe9f8-6972-4679-bd59-93af59b3afc0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.134537] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1243.134711] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1243.135662] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2aedc5f5-fb82-45d8-8e1c-cd58beb2aa10 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.141155] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1243.141155] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5279a5a6-51b1-9f8d-8c7c-7573a2d5df1c" [ 1243.141155] env[68964]: _type = "Task" [ 1243.141155] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1243.148926] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5279a5a6-51b1-9f8d-8c7c-7573a2d5df1c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1243.200219] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1243.200445] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1243.200623] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Deleting the datastore file [datastore2] 7f3f326c-2127-426e-a137-6f33512f4cb2 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1243.200888] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dd0fc9a7-4938-45d4-a689-413ff467976d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.207350] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Waiting for the task: (returnval){ [ 1243.207350] env[68964]: value = "task-3431679" [ 1243.207350] env[68964]: _type = "Task" [ 1243.207350] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1243.215644] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Task: {'id': task-3431679, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1243.651854] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1243.652141] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating directory with path [datastore2] vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1243.652380] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d12df0a6-97ab-47d3-8fcf-dc713ae54fff {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.664180] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created directory with path [datastore2] vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1243.664394] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Fetch image to [datastore2] vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1243.664567] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore2] vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1243.665315] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-500bf579-1ec9-4487-bd26-16746f05bd71 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.671974] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7dcbc11-6cf6-4111-8e93-bff6a96ffb51 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.680970] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a946f458-f397-4267-93fc-b851ce1d351e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.716188] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-658cd286-4be9-4499-9654-f6fdff49778e {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.723284] env[68964]: DEBUG oslo_vmware.api [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Task: {'id': task-3431679, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078152} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1243.724799] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1243.725031] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1243.725224] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1243.725426] env[68964]: INFO nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Took 0.60 seconds to destroy the instance on the hypervisor. 
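The DeleteDatastoreFile_Task records above (invoked, polled at "progress is 0%.", then reported completed with a duration_secs) follow oslo.vmware's standard invoke-then-poll pattern. Below is a minimal sketch of that pattern, assuming a reachable vCenter; the endpoint, credentials, retry/poll settings, datastore path, and datacenter reference are illustrative placeholders, not values taken from this log.

from oslo_vmware import api

# Sketch only: the host, credentials and poll settings are hypothetical.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)
file_manager = session.vim.service_content.fileManager
# FileManager.DeleteDatastoreFile_Task returns a Task managed-object ref.
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] some-instance-dir',  # hypothetical datastore path
    datacenter=None)                        # real callers pass a Datacenter ref
# wait_for_task polls task_info (the "progress is ..." records) and raises
# a translated exception if the task ends in a VIM fault.
session.wait_for_task(task)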
[ 1243.727178] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1adf9ef7-d19a-4334-b6f5-6b818b3565d7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1243.729105] env[68964]: DEBUG nova.compute.claims [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1243.729282] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1243.729489] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1243.754200] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore2 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1243.792652] env[68964]: DEBUG nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Refreshing inventories for resource provider 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1243.811185] env[68964]: DEBUG nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Updating ProviderTree inventory for provider 63b0294e-f555-48a6-a542-3466427066a9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1243.811185] env[68964]: DEBUG nova.compute.provider_tree [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1243.813364] env[68964]: DEBUG oslo_vmware.rw_handles [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1243.872452] env[68964]: DEBUG nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Refreshing aggregate associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, aggregates: None {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1243.876517] env[68964]: DEBUG oslo_vmware.rw_handles [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1243.876686] env[68964]: DEBUG oslo_vmware.rw_handles [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1243.891989] env[68964]: DEBUG nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Refreshing trait associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1244.183583] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd646a96-2e4c-4b01-a94d-6c5ffb2705f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.192306] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729b9606-7d84-4455-9f32-184481b698d0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.224131] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ccbaa5f-f893-4ea1-9fc3-562bef20a374 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.231813] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9308ce2a-8ffb-417d-8597-883ca4c337f4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.244970] env[68964]: DEBUG nova.compute.provider_tree [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1244.254654] env[68964]: DEBUG nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1244.271211] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.542s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1244.271751] env[68964]: ERROR nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] 
[instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1244.271751] env[68964]: Faults: ['InvalidArgument'] [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Traceback (most recent call last): [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self.driver.spawn(context, instance, image_meta, [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self._fetch_image_if_missing(context, vi) [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] image_cache(vi, tmp_image_ds_loc) [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] vm_util.copy_virtual_disk( [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] session._wait_for_task(vmdk_copy_task) [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] return self.wait_for_task(task_ref) [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] return evt.wait() [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] result = hub.switch() [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1244.271751] 
env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] return self.greenlet.switch() [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] self.f(*self.args, **self.kw) [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] raise exceptions.translate_fault(task_info.error) [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Faults: ['InvalidArgument'] [ 1244.271751] env[68964]: ERROR nova.compute.manager [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] [ 1244.272617] env[68964]: DEBUG nova.compute.utils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1244.275031] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Build of instance 7f3f326c-2127-426e-a137-6f33512f4cb2 was re-scheduled: A specified parameter was not correct: fileType [ 1244.275031] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1244.275407] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1244.275597] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1244.275770] env[68964]: DEBUG nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1244.275931] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1244.661913] env[68964]: DEBUG nova.network.neutron [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1244.672759] env[68964]: INFO nova.compute.manager [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Took 0.40 seconds to deallocate network for instance. [ 1244.759631] env[68964]: INFO nova.scheduler.client.report [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Deleted allocations for instance 7f3f326c-2127-426e-a137-6f33512f4cb2 [ 1244.780829] env[68964]: DEBUG oslo_concurrency.lockutils [None req-91b651cb-bca6-40c4-b605-f3f5fdb3e96e tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 388.192s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1244.781998] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 192.366s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.782525] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Acquiring lock "7f3f326c-2127-426e-a137-6f33512f4cb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1244.782525] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock
"7f3f326c-2127-426e-a137-6f33512f4cb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.782693] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1244.784856] env[68964]: INFO nova.compute.manager [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Terminating instance [ 1244.786578] env[68964]: DEBUG nova.compute.manager [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1244.786768] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1244.787248] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0dc0369f-9269-4347-a652-b719aef31878 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.791111] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1244.797604] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b627529-c866-47d0-986d-63a4e303f1b4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1244.827228] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7f3f326c-2127-426e-a137-6f33512f4cb2 could not be found. 
[ 1244.827443] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1244.827625] env[68964]: INFO nova.compute.manager [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1244.827937] env[68964]: DEBUG oslo.service.loopingcall [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1244.830059] env[68964]: DEBUG nova.compute.manager [-] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1244.830161] env[68964]: DEBUG nova.network.neutron [-] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1244.845864] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1244.846132] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1244.847699] env[68964]: INFO nova.compute.claims [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1244.865987] env[68964]: DEBUG nova.network.neutron [-] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1244.887187] env[68964]: INFO nova.compute.manager [-] [instance: 7f3f326c-2127-426e-a137-6f33512f4cb2] Took 0.06 seconds to deallocate network for instance.
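The "Waiting for function ... _deallocate_network_with_retries to return." record above comes from oslo.service's looping-call helper, which nova drives here to retry network deallocation. A minimal sketch of that helper, using the simpler FixedIntervalLoopingCall variant and an illustrative polled function; the real caller's retry policy is not visible in this log.

from oslo_service import loopingcall

def _poll(state):
    # Illustrative body: end the loop by raising LoopingCallDone;
    # its retvalue becomes the result of wait().
    state['calls'] += 1
    if state['calls'] >= 3:
        raise loopingcall.LoopingCallDone(retvalue=True)

state = {'calls': 0}
timer = loopingcall.FixedIntervalLoopingCall(_poll, state)
# start() schedules _poll every 0.5s and returns an event; wait() blocks
# until LoopingCallDone -- the "Waiting for function ... to return." step.
assert timer.start(interval=0.5).wait() is True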
[ 1244.985037] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c74623f-678d-472a-909c-7face8fd0318 tempest-FloatingIPsAssociationTestJSON-616192065 tempest-FloatingIPsAssociationTestJSON-616192065-project-member] Lock "7f3f326c-2127-426e-a137-6f33512f4cb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.203s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.168467] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7ce4a50-1104-4504-9dda-65e9e2f906a2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.176118] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10b8dd30-52d7-4ab8-9199-1bee6f43a3a9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.207478] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aa8d2ee-0246-4c79-bbe7-68591b90bf51 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.215175] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7902914-2891-4951-960f-d7c8ee793491 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.230717] env[68964]: DEBUG nova.compute.provider_tree [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1245.239272] env[68964]: DEBUG nova.scheduler.client.report [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1245.255316] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.409s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1245.255809] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Start building networks asynchronously for instance.
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1245.288610] env[68964]: DEBUG nova.compute.utils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1245.290109] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1245.290285] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1245.298505] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1245.359926] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1245.384177] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1245.384443] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1245.384598] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1245.384779] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1245.384924] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1245.385071] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1245.385282] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1245.385443] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1245.385609]
env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1245.385768] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1245.385937] env[68964]: DEBUG nova.virt.hardware [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1245.386782] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4438f4a4-f112-41c7-b062-d9a5ff958e51 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1245.390493] env[68964]: DEBUG nova.policy [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdf2567a5f234d3ca11c17b2a6c50dab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3159a58c1d23417eb9c756a88435d17e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1245.396955] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-059e145d-7e3f-4c90-9dc4-014aaa482163 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.217205] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Successfully created port: 02b10154-efaf-4ea5-a202-538d0e3b9052 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1247.405423] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Successfully updated port: 02b10154-efaf-4ea5-a202-538d0e3b9052 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1247.414237] env[68964]: DEBUG nova.compute.manager [req-b616cbba-d539-4a65-aa9d-4d416d7317ae req-8d6b4522-26d7-4f4f-9f4c-53e04f8b6e20 service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Received event network-vif-plugged-02b10154-efaf-4ea5-a202-538d0e3b9052 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1247.418375] env[68964]: DEBUG oslo_concurrency.lockutils 
[req-b616cbba-d539-4a65-aa9d-4d416d7317ae req-8d6b4522-26d7-4f4f-9f4c-53e04f8b6e20 service nova] Acquiring lock "704ec14b-410e-4175-b032-69074b332d87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.418630] env[68964]: DEBUG oslo_concurrency.lockutils [req-b616cbba-d539-4a65-aa9d-4d416d7317ae req-8d6b4522-26d7-4f4f-9f4c-53e04f8b6e20 service nova] Lock "704ec14b-410e-4175-b032-69074b332d87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1247.418805] env[68964]: DEBUG oslo_concurrency.lockutils [req-b616cbba-d539-4a65-aa9d-4d416d7317ae req-8d6b4522-26d7-4f4f-9f4c-53e04f8b6e20 service nova] Lock "704ec14b-410e-4175-b032-69074b332d87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1247.419159] env[68964]: DEBUG nova.compute.manager [req-b616cbba-d539-4a65-aa9d-4d416d7317ae req-8d6b4522-26d7-4f4f-9f4c-53e04f8b6e20 service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] No waiting events found dispatching network-vif-plugged-02b10154-efaf-4ea5-a202-538d0e3b9052 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1247.419332] env[68964]: WARNING nova.compute.manager [req-b616cbba-d539-4a65-aa9d-4d416d7317ae req-8d6b4522-26d7-4f4f-9f4c-53e04f8b6e20 service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Received unexpected event network-vif-plugged-02b10154-efaf-4ea5-a202-538d0e3b9052 for instance with vm_state building and task_state spawning. [ 1247.421916] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "refresh_cache-704ec14b-410e-4175-b032-69074b332d87" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1247.422263] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "refresh_cache-704ec14b-410e-4175-b032-69074b332d87" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1247.422545] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1247.478282] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance cache missing network info. 
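The lock/pop sequence above is the compute manager's instance-event dispatch: external events from Neutron are matched against registered waiters under a per-instance "-events" lock, and an event with no registered waiter is logged as unexpected, as happened here because the instance was still spawning. A rough, hypothetical model of that mechanism:

```python
# Toy model of pop_instance_event/external_instance_event; the real table
# lives on nova.compute.manager.InstanceEvents and uses eventlet primitives.
import threading

_lock = threading.Lock()
_waiters = {}  # {instance_uuid: {event_name: threading.Event}}

def prepare_for_event(instance_uuid, event_name):
    # registered before the operation that will trigger the event
    with _lock:
        ev = threading.Event()
        _waiters.setdefault(instance_uuid, {})[event_name] = ev
        return ev

def external_instance_event(instance_uuid, event_name):
    # invoked when Neutron reports e.g. network-vif-plugged
    with _lock:
        ev = _waiters.get(instance_uuid, {}).pop(event_name, None)
    if ev is None:
        print("Received unexpected event %s for instance %s"
              % (event_name, instance_uuid))
    else:
        ev.set()  # wake whoever is blocked waiting for the event

external_instance_event("704ec14b-410e-4175-b032-69074b332d87",
                        "network-vif-plugged-02b10154-efaf-4ea5-a202-538d0e3b9052")
```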
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1247.567260] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_power_states {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1247.598341] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Getting list of instances from cluster (obj){ [ 1247.598341] env[68964]: value = "domain-c8" [ 1247.598341] env[68964]: _type = "ClusterComputeResource" [ 1247.598341] env[68964]: } {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1247.599816] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37c1a0a8-0147-4ff4-83ec-c89fdeb3dd58 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.618439] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Got total of 9 instances {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1247.618646] env[68964]: WARNING nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] While synchronizing instance power states, found 10 instances in the database and 9 instances on the hypervisor. [ 1247.618793] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.618990] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 2d0469ba-ad42-4b06-ade2-cd64487278c5 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.619185] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 238794bb-9995-4bb0-954d-7ca0ef825e19 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.619346] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 4d272615-e2dd-4540-88d0-4a209f559147 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.619495] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.619645] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 244140d1-bf22-415a-b770-05f2fe106149 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.619791] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 96c1b70b-9a17-46b1-999d-558b85c77d22 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.619936] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid a317d842-0282-4ace-a457-d8031cf0adca {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.620127] env[68964]: 
DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 8ee9e517-075e-4faf-9740-32f8fa585eb5 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.620290] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 704ec14b-410e-4175-b032-69074b332d87 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1247.620561] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.620797] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.621366] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "238794bb-9995-4bb0-954d-7ca0ef825e19" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.621366] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "4d272615-e2dd-4540-88d0-4a209f559147" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.621493] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.621649] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "244140d1-bf22-415a-b770-05f2fe106149" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.621850] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "96c1b70b-9a17-46b1-999d-558b85c77d22" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.622079] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "a317d842-0282-4ace-a457-d8031cf0adca" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.622251] env[68964]: 
DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.622463] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "704ec14b-410e-4175-b032-69074b332d87" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1247.978487] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Updating instance_info_cache with network_info: [{"id": "02b10154-efaf-4ea5-a202-538d0e3b9052", "address": "fa:16:3e:24:25:d3", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap02b10154-ef", "ovs_interfaceid": "02b10154-efaf-4ea5-a202-538d0e3b9052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1247.990876] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "refresh_cache-704ec14b-410e-4175-b032-69074b332d87" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1247.991462] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance network_info: |[{"id": "02b10154-efaf-4ea5-a202-538d0e3b9052", "address": "fa:16:3e:24:25:d3", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": 
false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap02b10154-ef", "ovs_interfaceid": "02b10154-efaf-4ea5-a202-538d0e3b9052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1247.991612] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:25:d3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '02b10154-efaf-4ea5-a202-538d0e3b9052', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1247.999080] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating folder: Project (3159a58c1d23417eb9c756a88435d17e). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1247.999620] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79548912-9d42-4b53-9bfc-bddd111270f2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.013951] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created folder: Project (3159a58c1d23417eb9c756a88435d17e) in parent group-v684465. [ 1248.013951] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating folder: Instances. Parent ref: group-v684563. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1248.013951] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dde2f5f2-43df-4a8f-a339-dce3776824c8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.023538] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created folder: Instances in parent group-v684563. [ 1248.023771] env[68964]: DEBUG oslo.service.loopingcall [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1248.023954] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 704ec14b-410e-4175-b032-69074b332d87] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1248.024185] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-38fb4de4-5cab-4c25-8ff6-78963ce737dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.044631] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1248.044631] env[68964]: value = "task-3431682" [ 1248.044631] env[68964]: _type = "Task" [ 1248.044631] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1248.051957] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431682, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1248.554214] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431682, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1249.054849] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431682, 'name': CreateVM_Task, 'duration_secs': 0.790791} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1249.055183] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 704ec14b-410e-4175-b032-69074b332d87] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1249.055749] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1249.055911] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1249.056258] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1249.056570] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ba9744fc-eb1c-4434-98a3-6220bad3288f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1249.062065] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: 
(returnval){ [ 1249.062065] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529262f6-2c0c-b69b-6e83-9bfb3131db16" [ 1249.062065] env[68964]: _type = "Task" [ 1249.062065] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1249.071316] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529262f6-2c0c-b69b-6e83-9bfb3131db16, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1249.436971] env[68964]: DEBUG nova.compute.manager [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Received event network-changed-02b10154-efaf-4ea5-a202-538d0e3b9052 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1249.437131] env[68964]: DEBUG nova.compute.manager [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Refreshing instance network info cache due to event network-changed-02b10154-efaf-4ea5-a202-538d0e3b9052. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1249.437371] env[68964]: DEBUG oslo_concurrency.lockutils [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] Acquiring lock "refresh_cache-704ec14b-410e-4175-b032-69074b332d87" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1249.437576] env[68964]: DEBUG oslo_concurrency.lockutils [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] Acquired lock "refresh_cache-704ec14b-410e-4175-b032-69074b332d87" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1249.437677] env[68964]: DEBUG nova.network.neutron [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Refreshing network info cache for port 02b10154-efaf-4ea5-a202-538d0e3b9052 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1249.575797] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1249.576761] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1249.577151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] 
Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1249.825069] env[68964]: DEBUG nova.network.neutron [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Updated VIF entry in instance network info cache for port 02b10154-efaf-4ea5-a202-538d0e3b9052. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1249.825463] env[68964]: DEBUG nova.network.neutron [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] [instance: 704ec14b-410e-4175-b032-69074b332d87] Updating instance_info_cache with network_info: [{"id": "02b10154-efaf-4ea5-a202-538d0e3b9052", "address": "fa:16:3e:24:25:d3", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap02b10154-ef", "ovs_interfaceid": "02b10154-efaf-4ea5-a202-538d0e3b9052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1249.834949] env[68964]: DEBUG oslo_concurrency.lockutils [req-c7183b88-96f2-4679-879b-a2f71cb16cb4 req-e8bccad0-ce90-4ad5-81f2-a4249edc586a service nova] Releasing lock "refresh_cache-704ec14b-410e-4175-b032-69074b332d87" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1258.094205] env[68964]: WARNING oslo_vmware.rw_handles [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = 
self._read_status() [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1258.094205] env[68964]: ERROR oslo_vmware.rw_handles [ 1258.094205] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1258.095840] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1258.096090] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Copying Virtual Disk [datastore1] vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/ab3a650f-9d55-4c07-b3cc-6f11c28bed3e/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1258.096377] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cbe9da1c-e58f-40a1-8505-b7360ecff430 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.103946] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 1258.103946] env[68964]: value = "task-3431683" [ 1258.103946] env[68964]: _type = "Task" [ 1258.103946] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1258.111740] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': task-3431683, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1258.614285] env[68964]: DEBUG oslo_vmware.exceptions [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Fault InvalidArgument not matched. 
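"Fault InvalidArgument not matched" below means the fault name on the failed CopyVirtualDisk_Task had no specific exception class registered, so the generic VimFaultException is raised instead, which is exactly what the spawn traceback then shows. A simplified model of that translation step (not oslo.vmware's actual lookup table):

```python
# Toy version of oslo.vmware's fault-to-exception translation.
class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

_FAULT_CLASSES = {}  # would map e.g. "FileNotFound" to a specific exception

def translate_fault(fault_name, message):
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is None:  # the "Fault InvalidArgument not matched" case
        return VimFaultException([fault_name], message)
    return cls(message)

exc = translate_fault("InvalidArgument",
                      "A specified parameter was not correct: fileType")
print(exc.fault_list)  # ['InvalidArgument']
```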
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1258.614505] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1258.615076] env[68964]: ERROR nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1258.615076] env[68964]: Faults: ['InvalidArgument'] [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Traceback (most recent call last): [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] yield resources [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self.driver.spawn(context, instance, image_meta, [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self._fetch_image_if_missing(context, vi) [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] image_cache(vi, tmp_image_ds_loc) [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] vm_util.copy_virtual_disk( [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] session._wait_for_task(vmdk_copy_task) [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] return self.wait_for_task(task_ref) [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] return evt.wait() [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] result = hub.switch() [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] return self.greenlet.switch() [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self.f(*self.args, **self.kw) [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] raise exceptions.translate_fault(task_info.error) [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Faults: ['InvalidArgument'] [ 1258.615076] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] [ 1258.616198] env[68964]: INFO nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Terminating instance [ 1258.616920] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1258.617321] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1258.617585] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-82169ccf-6427-4339-b4aa-0852af184f0f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.619908] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1258.620119] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1258.620834] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77e5d1a1-b34e-42a8-94d0-a6868894a309 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.627879] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1258.628156] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6b1a313d-201f-4b89-b6fd-86edcb2f0f59 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.630691] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1258.630875] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1258.631932] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e76c1a2e-549f-48c8-9dd7-847b2fcde3e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.636901] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Waiting for the task: (returnval){ [ 1258.636901] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52567a3a-7dd1-a85b-d3aa-b0447ae303dc" [ 1258.636901] env[68964]: _type = "Task" [ 1258.636901] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1258.643727] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52567a3a-7dd1-a85b-d3aa-b0447ae303dc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1258.699546] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1258.699752] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1258.699927] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Deleting the datastore file [datastore1] 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1258.700212] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4eca2449-48c6-4bc4-89bf-2c4bc716a193 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1258.705905] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for the task: (returnval){ [ 1258.705905] env[68964]: value = "task-3431685" [ 1258.705905] env[68964]: _type = "Task" [ 1258.705905] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1258.713576] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': task-3431685, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1259.147786] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1259.148082] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Creating directory with path [datastore1] vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1259.148171] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d14bd3f4-d797-4ff7-a93f-78d9b123e9ed {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1259.159665] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Created directory with path [datastore1] vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1259.159866] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Fetch image to [datastore1] vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1259.160046] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1259.160791] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c02f004-112f-4fec-9e5b-e81d53786ffd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1259.167597] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b619d96c-1e20-4e19-ad63-23d09cfa7623 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1259.176702] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19a16921-ba53-4ef4-b49c-8713569a6903 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1259.211513] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f9ca5d86-8b5e-4e8e-8072-df56b0219a2c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1259.218620] env[68964]: DEBUG oslo_vmware.api [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Task: {'id': task-3431685, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065828} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1259.220129] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1259.220327] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1259.220504] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1259.220678] env[68964]: INFO nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Took 0.60 seconds to destroy the instance on the hypervisor. 
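The DeleteDatastoreFile_Task lines above follow the same invoke-then-poll pattern used for every vCenter task in this log (CreateVM_Task, CopyVirtualDisk_Task, and so on): submit the call, get a Task reference back, and poll its state at a fixed interval, logging progress, until it reports success or carries an error. A self-contained toy version of that loop:

```python
# Condensed sketch of the wait_for_task/_poll_task pattern; ToyTask fakes
# the TaskInfo snapshots a real vCenter task would return.
import itertools
import time

class ToyTask:
    def __init__(self):
        self._states = itertools.chain(
            [("running", 0), ("running", 99)],
            itertools.repeat(("success", 100)))
    def poll(self):
        return next(self._states)  # (state, progress)

def wait_for_task(task, interval=0.5):
    while True:
        state, progress = task.poll()
        if state == "running":
            print("progress is %d%%." % progress)
        elif state == "success":
            return
        else:
            raise RuntimeError("task error")  # real code: translate_fault(...)
        time.sleep(interval)

wait_for_task(ToyTask(), interval=0.01)  # prints 0%, 99%, then completes
```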
[ 1259.222486] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-715d8649-d90e-40fc-8768-45d89df4d8ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1259.224435] env[68964]: DEBUG nova.compute.claims [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1259.224602] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1259.224812] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1259.246559] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1259.428801] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1259.487358] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1259.487551] env[68964]: DEBUG oslo_vmware.rw_handles [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
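The rw_handles entries above stream the image straight into the datastore over an HTTPS PUT against the ESX host's /folder endpoint, with the datacenter path and datastore name passed as query parameters and auth via a vCenter-issued service ticket (the AcquireGenericServiceTicket call a few entries back). A rough sketch of such an upload; the cookie name and the relaxed TLS setup are assumptions for this lab environment, not oslo.vmware's verbatim implementation:

```python
import http.client
import ssl
import urllib.parse

def write_image_to_datastore(host, ds_path, size, ticket, chunks):
    # Lab-only TLS setup: the endpoint uses a private CA in this deployment.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    query = urllib.parse.urlencode({"dcPath": "ha-datacenter",
                                    "dsName": "datastore1"})
    conn = http.client.HTTPSConnection(host, 443, context=ctx)
    conn.putrequest("PUT", "/folder/%s?%s" % (ds_path, query))
    conn.putheader("Content-Length", str(size))
    # Cookie name is an assumption; the real handle authenticates with the
    # generic service ticket acquired from vCenter.
    conn.putheader("Cookie", "vmware_cgi_ticket=%s" % ticket)
    conn.endheaders()
    for chunk in chunks:       # the "image iterator" from Glance
        conn.send(chunk)
    resp = conn.getresponse()  # the call that raised RemoteDisconnected earlier
    resp.read()
    conn.close()
    return resp.status
```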
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1259.602631] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7f7d9de-98c4-432d-a52a-62cec144a8bd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1259.610494] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb0c73b-3b70-4d4c-ac54-b1a03023629d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1259.641037] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0acad6-8fc9-406b-91b9-58aa1a9cbb7d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1259.648412] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbea1dc2-840f-404a-9e07-3dbda743f8a7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1259.662474] env[68964]: DEBUG nova.compute.provider_tree [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1259.670025] env[68964]: DEBUG nova.scheduler.client.report [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1259.685716] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.461s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1259.686314] env[68964]: ERROR nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1259.686314] env[68964]: Faults: ['InvalidArgument']
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Traceback (most recent call last):
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self.driver.spawn(context, instance, image_meta,
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self._fetch_image_if_missing(context, vi)
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] image_cache(vi, tmp_image_ds_loc)
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] vm_util.copy_virtual_disk(
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] session._wait_for_task(vmdk_copy_task)
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] return self.wait_for_task(task_ref)
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] return evt.wait()
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] result = hub.switch()
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] return self.greenlet.switch()
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] self.f(*self.args, **self.kw)
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] raise exceptions.translate_fault(task_info.error)
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Faults: ['InvalidArgument']
[ 1259.686314] env[68964]: ERROR nova.compute.manager [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048]
[ 1259.687386] env[68964]: DEBUG nova.compute.utils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1259.688951] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Build of instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 was re-scheduled: A specified parameter was not correct: fileType
[ 1259.688951] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1259.689344] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1259.689520] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1259.689689] env[68964]: DEBUG nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1259.689851] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1260.252432] env[68964]: DEBUG nova.network.neutron [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1260.263437] env[68964]: INFO nova.compute.manager [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Took 0.57 seconds to deallocate network for instance. [ 1260.374040] env[68964]: INFO nova.scheduler.client.report [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Deleted allocations for instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 [ 1260.402211] env[68964]: DEBUG oslo_concurrency.lockutils [None req-73f3c9e6-727a-46e2-ac43-3cdaae15bab2 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 440.588s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1260.403458] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 244.372s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.403686] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Acquiring lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1260.403896] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.404080] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1260.406029] env[68964]: INFO nova.compute.manager [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Terminating instance [ 1260.407596] env[68964]: DEBUG nova.compute.manager [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1260.407790] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1260.408250] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b5e988ef-08eb-4c5f-93a2-700cfe6e40be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.413558] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1260.420276] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fca31e1c-95c1-4761-b337-7184d8f3f691 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.448618] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8e77ed0b-ea43-4c15-94de-63c4e9d5e048 could not be found. [ 1260.448739] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1260.448915] env[68964]: INFO nova.compute.manager [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Took 0.04 seconds to destroy the instance on the hypervisor. 
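The "Acquiring lock ... / Lock ... acquired :: waited N / released :: held N" triplets that recur throughout this log (including the terminate_instance and instance-events locks just above) are emitted by oslo.concurrency's lockutils. A minimal sketch of that pattern, assuming an illustrative lock name taken from the log; this is not Nova's code:

```python
from oslo_concurrency import lockutils

# Illustrative lock name copied from the log entries above.
INSTANCE_LOCK = "8e77ed0b-ea43-4c15-94de-63c4e9d5e048"

@lockutils.synchronized(INSTANCE_LOCK)
def do_terminate_instance():
    """Body runs with the named lock held; lockutils logs the DEBUG
    'acquired :: waited N' / 'released :: held N' pair around it."""

# The same module also offers a context-manager form, as used for the
# "compute_resources" lock seen in the resource tracker entries:
with lockutils.lock("compute_resources"):
    pass  # e.g. claim or abort resources while the lock is held
```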
[ 1260.449177] env[68964]: DEBUG oslo.service.loopingcall [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1260.451407] env[68964]: DEBUG nova.compute.manager [-] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1260.451484] env[68964]: DEBUG nova.network.neutron [-] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1260.468325] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1260.468873] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.470727] env[68964]: INFO nova.compute.claims [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1260.481312] env[68964]: DEBUG nova.network.neutron [-] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1260.499236] env[68964]: INFO nova.compute.manager [-] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] Took 0.05 seconds to deallocate network for instance. [ 1260.593068] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c090c0f-eb5e-4bc8-b18d-bc7c5451fcd0 tempest-SecurityGroupsTestJSON-791381166 tempest-SecurityGroupsTestJSON-791381166-project-member] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.190s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1260.597126] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 12.973s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.597297] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8e77ed0b-ea43-4c15-94de-63c4e9d5e048] During sync_power_state the instance has a pending task (deleting). 
Skip. [ 1260.597479] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8e77ed0b-ea43-4c15-94de-63c4e9d5e048" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.004s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1260.780576] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72a7309d-abc7-427f-ab90-941b86ef88d6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.788666] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f89fc432-440e-4ff0-adfa-15e2a06a1cec {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.818010] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aa8b8fc-1d53-491f-a9ca-c0a2cbed9de4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.824689] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a1d38dd-75d4-4a00-861d-b7e1b8fde6d6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1260.837206] env[68964]: DEBUG nova.compute.provider_tree [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1260.845966] env[68964]: DEBUG nova.scheduler.client.report [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1260.861348] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.393s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1260.861883] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1260.893736] env[68964]: DEBUG nova.compute.utils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1260.895238] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1260.895417] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1260.903561] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1260.959766] env[68964]: DEBUG nova.policy [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77c1300744bb43e8a66aaa4d9c96310b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '722d3fee2ed441a798e10ac2f905df25', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1260.970331] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1260.998202] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1260.998510] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1260.998667] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1260.998846] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1260.998991] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1260.999190] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1260.999366] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1260.999526] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1260.999688] 
env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1260.999875] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1261.000012] env[68964]: DEBUG nova.virt.hardware [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1261.000855] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c7ce50f-bd9d-433a-a078-200b4ccce193 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.008666] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b169005-c5f8-469c-be48-69f0c31f9673 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.331127] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Successfully created port: d30ee0e5-094a-4fb0-87c0-6d7ae4400745 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1262.089676] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Successfully updated port: d30ee0e5-094a-4fb0-87c0-6d7ae4400745 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1262.102139] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "refresh_cache-ec476af0-9287-4f82-a4cd-c2a3771f1b68" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1262.102531] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquired lock "refresh_cache-ec476af0-9287-4f82-a4cd-c2a3771f1b68" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1262.102531] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1262.164427] env[68964]: DEBUG 
nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1262.298486] env[68964]: DEBUG nova.compute.manager [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Received event network-vif-plugged-d30ee0e5-094a-4fb0-87c0-6d7ae4400745 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1262.298702] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] Acquiring lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1262.298909] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1262.299087] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1262.299260] env[68964]: DEBUG nova.compute.manager [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] No waiting events found dispatching network-vif-plugged-d30ee0e5-094a-4fb0-87c0-6d7ae4400745 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1262.299423] env[68964]: WARNING nova.compute.manager [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Received unexpected event network-vif-plugged-d30ee0e5-094a-4fb0-87c0-6d7ae4400745 for instance with vm_state building and task_state spawning. [ 1262.299579] env[68964]: DEBUG nova.compute.manager [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Received event network-changed-d30ee0e5-094a-4fb0-87c0-6d7ae4400745 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1262.299731] env[68964]: DEBUG nova.compute.manager [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Refreshing instance network info cache due to event network-changed-d30ee0e5-094a-4fb0-87c0-6d7ae4400745. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1262.299893] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] Acquiring lock "refresh_cache-ec476af0-9287-4f82-a4cd-c2a3771f1b68" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1262.416608] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Updating instance_info_cache with network_info: [{"id": "d30ee0e5-094a-4fb0-87c0-6d7ae4400745", "address": "fa:16:3e:9d:fc:38", "network": {"id": "f22cadb9-32a6-4c38-a3b4-061e25205774", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1032554889-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "722d3fee2ed441a798e10ac2f905df25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b399c74-1411-408a-b4cd-84e268ae83fe", "external-id": "nsx-vlan-transportzone-486", "segmentation_id": 486, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd30ee0e5-09", "ovs_interfaceid": "d30ee0e5-094a-4fb0-87c0-6d7ae4400745", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1262.427734] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Releasing lock "refresh_cache-ec476af0-9287-4f82-a4cd-c2a3771f1b68" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1262.428122] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance network_info: |[{"id": "d30ee0e5-094a-4fb0-87c0-6d7ae4400745", "address": "fa:16:3e:9d:fc:38", "network": {"id": "f22cadb9-32a6-4c38-a3b4-061e25205774", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1032554889-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "722d3fee2ed441a798e10ac2f905df25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b399c74-1411-408a-b4cd-84e268ae83fe", "external-id": "nsx-vlan-transportzone-486", 
"segmentation_id": 486, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd30ee0e5-09", "ovs_interfaceid": "d30ee0e5-094a-4fb0-87c0-6d7ae4400745", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1262.428505] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] Acquired lock "refresh_cache-ec476af0-9287-4f82-a4cd-c2a3771f1b68" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1262.428933] env[68964]: DEBUG nova.network.neutron [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Refreshing network info cache for port d30ee0e5-094a-4fb0-87c0-6d7ae4400745 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1262.430322] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9d:fc:38', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b399c74-1411-408a-b4cd-84e268ae83fe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd30ee0e5-094a-4fb0-87c0-6d7ae4400745', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1262.438685] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Creating folder: Project (722d3fee2ed441a798e10ac2f905df25). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1262.441922] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f748ad6e-ac2b-4315-9090-dedf870392af {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.453592] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Created folder: Project (722d3fee2ed441a798e10ac2f905df25) in parent group-v684465. [ 1262.453592] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Creating folder: Instances. Parent ref: group-v684566. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1262.453782] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2268068f-e0b6-460b-ba51-3d11efdc00b8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.461056] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Created folder: Instances in parent group-v684566. 
[ 1262.461282] env[68964]: DEBUG oslo.service.loopingcall [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1262.461465] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1262.461650] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b9d2efae-3db7-4d0c-8dcc-bf76c7b790a9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.479523] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1262.479523] env[68964]: value = "task-3431688" [ 1262.479523] env[68964]: _type = "Task" [ 1262.479523] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1262.491236] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431688, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1262.731804] env[68964]: DEBUG nova.network.neutron [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Updated VIF entry in instance network info cache for port d30ee0e5-094a-4fb0-87c0-6d7ae4400745. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1262.732121] env[68964]: DEBUG nova.network.neutron [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Updating instance_info_cache with network_info: [{"id": "d30ee0e5-094a-4fb0-87c0-6d7ae4400745", "address": "fa:16:3e:9d:fc:38", "network": {"id": "f22cadb9-32a6-4c38-a3b4-061e25205774", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1032554889-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "722d3fee2ed441a798e10ac2f905df25", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b399c74-1411-408a-b4cd-84e268ae83fe", "external-id": "nsx-vlan-transportzone-486", "segmentation_id": 486, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd30ee0e5-09", "ovs_interfaceid": "d30ee0e5-094a-4fb0-87c0-6d7ae4400745", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1262.745522] env[68964]: DEBUG oslo_concurrency.lockutils [req-0f7ba3fb-e698-435d-808d-0bd9db627eef req-806fdb03-5534-4a57-84bc-3d3e86cd09d4 service nova] Releasing lock "refresh_cache-ec476af0-9287-4f82-a4cd-c2a3771f1b68" {{(pid=68964) 
lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1262.989241] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431688, 'name': CreateVM_Task, 'duration_secs': 0.266212} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1262.989422] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1262.999329] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1262.999523] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1262.999833] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1263.000089] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6bf1807f-4b75-4d30-a480-8773074f47ed {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.004468] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Waiting for the task: (returnval){ [ 1263.004468] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ba9068-0b20-488c-6e4a-dd453cbb1bff" [ 1263.004468] env[68964]: _type = "Task" [ 1263.004468] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1263.012376] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ba9068-0b20-488c-6e4a-dd453cbb1bff, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1263.516060] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1263.516060] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1263.516060] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1265.780200] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1265.780501] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1266.532387] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "96c1b70b-9a17-46b1-999d-558b85c77d22" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.724012] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1267.719564] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1267.725048] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1268.394797] env[68964]: DEBUG oslo_concurrency.lockutils [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock 
"a317d842-0282-4ace-a457-d8031cf0adca" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.724774] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1268.724774] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1268.724774] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1268.747017] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.747726] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.747726] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.747726] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.747938] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.747938] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.748124] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.748988] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.748988] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.748988] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1268.748988] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1268.748988] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1268.761946] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.762257] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.762431] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.762586] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1268.763735] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdbee04f-3a47-4ab1-bf4c-a0215ac28856 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.773100] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ccb6dc4-7101-4d19-9da6-f898f03237c8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.787500] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45ea006c-7dff-44e8-845a-e193e15d078a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.793721] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ac374a-e3bc-42e2-9aaa-62d7a969bc0c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.823287] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1268.823453] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.823646] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.900056] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900230] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900358] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900479] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900595] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900851] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900851] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a317d842-0282-4ace-a457-d8031cf0adca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.900958] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.901057] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.901161] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1268.912229] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.924958] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.934966] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 15faae57-ab24-417e-9bf2-1aee11ccc2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.944588] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.953784] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7e9acc0-1427-4382-bcf8-99fdcc08aac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.962515] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e437a43d-00b7-4feb-ae97-215238cf845b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.971485] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 43a9c974-399f-44cb-b836-4bdf17a8d768 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.980128] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 530b7cf0-53d3-4bfb-b545-b0bb57dc91b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1268.988982] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance db7c702e-3d49-4eb7-9d7e-c715186f1f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1269.002145] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 178b3fc0-3f93-400f-ba19-d9703c62fd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1269.011546] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f64ec65-fc61-4c44-a489-a36b9f8750e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1269.021207] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1269.021456] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1269.021609] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1269.270102] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ab9670e-2121-440d-8f6f-ec698339d81e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.278026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7c2cb7d-3d55-4c22-854b-d8bd50d0da18 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.307565] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae660cde-1b35-41c0-aa75-91785508cbeb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.314463] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09c86088-5a67-41a0-9b34-7e91261819a8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.330292] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1269.340762] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1269.356486] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1269.356817] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.533s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1269.481337] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1270.333135] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1270.333135] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1270.333135] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1282.295264] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "704ec14b-410e-4175-b032-69074b332d87" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1290.744826] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "3d41d454-f370-46a6-ba97-17f5553d557c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1290.744826] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "3d41d454-f370-46a6-ba97-17f5553d557c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1293.106405] env[68964]: WARNING oslo_vmware.rw_handles [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1293.106405] env[68964]: ERROR oslo_vmware.rw_handles [ 1293.107118] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore2 {{(pid=68964) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1293.109479] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1293.110140] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Copying Virtual Disk [datastore2] vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore2] vmware_temp/ad25e5bc-2bd3-4e5e-895f-47a9ea285100/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1293.110140] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-64a235cd-a6c3-4912-ac66-cae7c7d20564 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.118874] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1293.118874] env[68964]: value = "task-3431689" [ 1293.118874] env[68964]: _type = "Task" [ 1293.118874] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1293.127023] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431689, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1293.629794] env[68964]: DEBUG oslo_vmware.exceptions [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1293.630187] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1293.630749] env[68964]: ERROR nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1293.630749] env[68964]: Faults: ['InvalidArgument'] [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] Traceback (most recent call last): [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] yield resources [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self.driver.spawn(context, instance, image_meta, [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self._fetch_image_if_missing(context, vi) [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] image_cache(vi, tmp_image_ds_loc) [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] vm_util.copy_virtual_disk( [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] session._wait_for_task(vmdk_copy_task) [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] return self.wait_for_task(task_ref) [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] return evt.wait() [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] result = hub.switch() [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] return self.greenlet.switch() [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self.f(*self.args, **self.kw) [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] raise exceptions.translate_fault(task_info.error) [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] Faults: ['InvalidArgument'] [ 1293.630749] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] [ 1293.631859] env[68964]: INFO nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Terminating instance [ 1293.634709] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1293.634977] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1293.635800] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94fa02d9-6d10-4847-8823-f97f6b36debd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.642547] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1293.642827] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-528070f4-59f7-49bc-be98-2f2c58e358a9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.710315] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1293.710463] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Deleting contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1293.710759] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleting the datastore file [datastore2] a317d842-0282-4ace-a457-d8031cf0adca {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1293.710907] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e19df98f-c670-4565-98dc-fd3070e96879 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1293.718368] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1293.718368] env[68964]: value = "task-3431691" [ 1293.718368] env[68964]: _type = "Task" [ 1293.718368] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1293.726758] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431691, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1294.229569] env[68964]: DEBUG oslo_vmware.api [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431691, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069492} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1294.229929] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1294.229979] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Deleted contents of the VM from datastore datastore2 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1294.230154] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1294.230318] env[68964]: INFO nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Took 0.60 seconds to destroy the instance on the hypervisor. 
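Note on the spawn failure above: the traceback shows the pattern end to end — a CopyVirtualDisk_Task is invoked, polled via wait_for_task() (the "progress is 0%" lines), and the vCenter fault is translated into a VimFaultException with Faults: ['InvalidArgument']. A minimal sketch of that pattern follows; only the call names come from the traceback, while the session wiring, datastore paths, and copy spec are illustrative placeholders, not a runnable reproduction against a real vCenter.

```python
# Sketch of the task-wait/fault-translation pattern visible in the log.
from oslo_vmware import exceptions as vexc

def copy_disk(session, source_ds_path, dest_ds_path, dc_ref, copy_spec=None):
    vim = session.vim
    # Matches "Invoking VirtualDiskManager.CopyVirtualDisk_Task" above.
    task = session.invoke_api(vim, 'CopyVirtualDisk_Task',
                              vim.service_content.virtualDiskManager,
                              sourceName=source_ds_path,
                              sourceDatacenter=dc_ref,
                              destName=dest_ds_path,
                              destSpec=copy_spec)
    try:
        # wait_for_task() polls the task and, when vCenter fails it, raises
        # exceptions.translate_fault(task_info.error) -- see _poll_task in
        # the traceback above.
        return session.wait_for_task(task)
    except vexc.VimFaultException as e:
        # Here fault_list == ['InvalidArgument'] and the message is
        # "A specified parameter was not correct: fileType"; it surfaces
        # later as "Instance failed to spawn".
        if 'InvalidArgument' in e.fault_list:
            raise
        raise
```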
[ 1294.232811] env[68964]: DEBUG nova.compute.claims [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1294.232984] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1294.233212] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1294.526432] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bf25b4d-f444-4155-842b-e75fc96f6767 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.534051] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29ab1d1f-a906-4758-9459-1b0ee4dcc429 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.564036] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1d56e66-eea3-4742-a815-7c98d76aaac5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.570913] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30645a3a-2e8e-4f50-9f18-4917f3a86d81 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1294.583900] env[68964]: DEBUG nova.compute.provider_tree [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1294.592215] env[68964]: DEBUG nova.scheduler.client.report [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1294.605643] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 
tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.372s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1294.606181] env[68964]: ERROR nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1294.606181] env[68964]: Faults: ['InvalidArgument'] [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] Traceback (most recent call last): [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self.driver.spawn(context, instance, image_meta, [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self._fetch_image_if_missing(context, vi) [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] image_cache(vi, tmp_image_ds_loc) [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] vm_util.copy_virtual_disk( [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] session._wait_for_task(vmdk_copy_task) [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] return self.wait_for_task(task_ref) [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] return evt.wait() [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: 
a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] result = hub.switch() [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] return self.greenlet.switch() [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] self.f(*self.args, **self.kw) [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] raise exceptions.translate_fault(task_info.error) [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] Faults: ['InvalidArgument'] [ 1294.606181] env[68964]: ERROR nova.compute.manager [instance: a317d842-0282-4ace-a457-d8031cf0adca] [ 1294.607158] env[68964]: DEBUG nova.compute.utils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1294.608334] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Build of instance a317d842-0282-4ace-a457-d8031cf0adca was re-scheduled: A specified parameter was not correct: fileType [ 1294.608334] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1294.608708] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1294.608879] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1294.609067] env[68964]: DEBUG nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1294.609265] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1295.206942] env[68964]: DEBUG nova.network.neutron [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1295.228089] env[68964]: INFO nova.compute.manager [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Took 0.62 seconds to deallocate network for instance. [ 1295.335878] env[68964]: INFO nova.scheduler.client.report [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleted allocations for instance a317d842-0282-4ace-a457-d8031cf0adca [ 1295.356931] env[68964]: DEBUG oslo_concurrency.lockutils [None req-14645e2b-ec95-4474-83ab-e34c8ead17b8 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "a317d842-0282-4ace-a457-d8031cf0adca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.574s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1295.358506] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "a317d842-0282-4ace-a457-d8031cf0adca" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 47.736s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1295.358695] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a317d842-0282-4ace-a457-d8031cf0adca] During sync_power_state the instance has a pending task (spawning). Skip. 
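The "Acquiring lock ... by ..." / "acquired ... waited" / "released ... held" triplets throughout this log (lockutils.py:404/409/423) come from oslo.concurrency's synchronized decorator, which serializes callers on a process-local lock keyed by name — here the instance UUID, which is why the _sync_power_states periodic task had to wait 47.736s while the build lock was held for 222.574s. A sketch of that pattern, with the function body as a placeholder:

```python
# Sketch of the named-lock pattern producing the lockutils log lines above.
from oslo_concurrency import lockutils

def terminate_instance(instance_uuid):
    @lockutils.synchronized(instance_uuid)
    def do_terminate_instance():
        # While this holds the per-instance lock, any other caller using the
        # same name (e.g. the power-state sync above) blocks and logs how
        # long it waited once it finally acquires the lock.
        pass
    do_terminate_instance()
```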
[ 1295.358935] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "a317d842-0282-4ace-a457-d8031cf0adca" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1295.359475] env[68964]: DEBUG oslo_concurrency.lockutils [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "a317d842-0282-4ace-a457-d8031cf0adca" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 26.965s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1295.359634] env[68964]: DEBUG oslo_concurrency.lockutils [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "a317d842-0282-4ace-a457-d8031cf0adca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1295.359980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "a317d842-0282-4ace-a457-d8031cf0adca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1295.359980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "a317d842-0282-4ace-a457-d8031cf0adca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1295.361850] env[68964]: INFO nova.compute.manager [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Terminating instance [ 1295.364214] env[68964]: DEBUG nova.compute.manager [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1295.364400] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1295.364714] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-20b3acf5-78fc-40cc-a0f1-d7d548687389 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.367900] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1295.374621] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8fbd256-4ca5-4935-b982-ca47b4ea4893 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.404092] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a317d842-0282-4ace-a457-d8031cf0adca could not be found. [ 1295.404243] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1295.404396] env[68964]: INFO nova.compute.manager [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1295.404666] env[68964]: DEBUG oslo.service.loopingcall [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1295.409149] env[68964]: DEBUG nova.compute.manager [-] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1295.409149] env[68964]: DEBUG nova.network.neutron [-] [instance: a317d842-0282-4ace-a457-d8031cf0adca] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1295.427570] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1295.427813] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1295.429281] env[68964]: INFO nova.compute.claims [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1295.450937] env[68964]: DEBUG nova.network.neutron [-] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1295.465215] env[68964]: INFO nova.compute.manager [-] [instance: a317d842-0282-4ace-a457-d8031cf0adca] Took 0.06 seconds to deallocate network for instance. 
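The inventory dict reported twice above ('VCPU', 'MEMORY_MB', 'DISK_GB' with reserved and allocation_ratio fields) is what placement uses to derive schedulable capacity: capacity = (total - reserved) * allocation_ratio per resource class. A small worked sketch using the exact values from this log:

```python
# Capacity derivation for the provider inventory logged above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)
# VCPU 192.0  -> 48 physical vcpus oversubscribed 4x (10 allocated so far)
# MEMORY_MB 196078.0
# DISK_GB 400.0 (max_unit 98 still caps any single allocation)
```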
[ 1295.559623] env[68964]: DEBUG oslo_concurrency.lockutils [None req-09ce6cfb-05d5-4b3c-ba20-c3b5226f3731 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "a317d842-0282-4ace-a457-d8031cf0adca" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.200s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1295.731425] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d947518a-ccad-4b14-8be3-a205261b94f4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.740480] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bbc0cdb-5bc0-49e9-ac91-e574172bd9a9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.771943] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0546f980-c262-4117-a5b8-0485aa5dd21c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.779323] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aedb8448-bfe5-4c0b-a9f5-1ea754221696 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.793374] env[68964]: DEBUG nova.compute.provider_tree [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1295.802373] env[68964]: DEBUG nova.scheduler.client.report [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1295.816015] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1295.816490] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1295.847399] env[68964]: DEBUG nova.compute.utils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1295.849044] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1295.849156] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1295.858113] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1295.920997] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1295.926381] env[68964]: DEBUG nova.policy [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f8b14c780659465881b97fe26cdbdb60', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7700973a8cd04067994a0dabd569727c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1295.943911] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1295.944154] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1295.944302] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1295.944482] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1295.944646] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1295.944805] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1295.945025] env[68964]: DEBUG 
nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1295.945187] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1295.945352] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1295.945513] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1295.945681] env[68964]: DEBUG nova.virt.hardware [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1295.946546] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-352b1020-a6a6-4bb1-a591-2ace1aa936e2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1295.954119] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1666e16-c198-4924-97da-dc5a70c2fbbb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1296.307646] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Successfully created port: 6bbef264-d5c8-4b36-b383-5511d1f31139 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1296.375326] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1296.375628] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1297.098406] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Successfully updated port: 6bbef264-d5c8-4b36-b383-5511d1f31139 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1297.114731] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "refresh_cache-1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1297.114868] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquired lock "refresh_cache-1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1297.115027] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1297.162958] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1297.391713] env[68964]: DEBUG nova.compute.manager [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Received event network-vif-plugged-6bbef264-d5c8-4b36-b383-5511d1f31139 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1297.391977] env[68964]: DEBUG oslo_concurrency.lockutils [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] Acquiring lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1297.392248] env[68964]: DEBUG oslo_concurrency.lockutils [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1297.392293] env[68964]: DEBUG oslo_concurrency.lockutils [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1297.392429] env[68964]: DEBUG nova.compute.manager [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] No waiting events found dispatching network-vif-plugged-6bbef264-d5c8-4b36-b383-5511d1f31139 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1297.392641] env[68964]: WARNING nova.compute.manager [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Received unexpected event network-vif-plugged-6bbef264-d5c8-4b36-b383-5511d1f31139 for instance with vm_state building and task_state spawning. [ 1297.392739] env[68964]: DEBUG nova.compute.manager [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Received event network-changed-6bbef264-d5c8-4b36-b383-5511d1f31139 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1297.392887] env[68964]: DEBUG nova.compute.manager [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Refreshing instance network info cache due to event network-changed-6bbef264-d5c8-4b36-b383-5511d1f31139. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1297.394107] env[68964]: DEBUG oslo_concurrency.lockutils [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] Acquiring lock "refresh_cache-1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1297.405157] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Updating instance_info_cache with network_info: [{"id": "6bbef264-d5c8-4b36-b383-5511d1f31139", "address": "fa:16:3e:a0:37:89", "network": {"id": "1d5ce5e3-f4f8-41a1-907b-b827bdd8c2e8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-756245163-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7700973a8cd04067994a0dabd569727c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bbef264-d5", "ovs_interfaceid": "6bbef264-d5c8-4b36-b383-5511d1f31139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1297.417862] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Releasing lock "refresh_cache-1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1297.418172] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance network_info: |[{"id": "6bbef264-d5c8-4b36-b383-5511d1f31139", "address": "fa:16:3e:a0:37:89", "network": {"id": "1d5ce5e3-f4f8-41a1-907b-b827bdd8c2e8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-756245163-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7700973a8cd04067994a0dabd569727c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bbef264-d5", "ovs_interfaceid": "6bbef264-d5c8-4b36-b383-5511d1f31139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1297.418461] env[68964]: DEBUG oslo_concurrency.lockutils [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] Acquired lock "refresh_cache-1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1297.418637] env[68964]: DEBUG nova.network.neutron [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Refreshing network info cache for port 6bbef264-d5c8-4b36-b383-5511d1f31139 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1297.419654] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:37:89', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6bbef264-d5c8-4b36-b383-5511d1f31139', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1297.427574] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Creating folder: Project (7700973a8cd04067994a0dabd569727c). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1297.430928] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-18357529-47d6-4ea4-938e-f079ea57c396 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.441528] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Created folder: Project (7700973a8cd04067994a0dabd569727c) in parent group-v684465. [ 1297.441712] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Creating folder: Instances. Parent ref: group-v684569. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1297.441931] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-56f4b3e9-7eec-460f-b3f1-98e8bdbb9dec {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.450848] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Created folder: Instances in parent group-v684569. 
[ 1297.451090] env[68964]: DEBUG oslo.service.loopingcall [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1297.451269] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1297.451458] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-884f9241-7b23-42e4-9720-a6d2fb35d0c4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.471906] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1297.471906] env[68964]: value = "task-3431694" [ 1297.471906] env[68964]: _type = "Task" [ 1297.471906] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1297.482103] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431694, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1297.755248] env[68964]: DEBUG nova.network.neutron [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Updated VIF entry in instance network info cache for port 6bbef264-d5c8-4b36-b383-5511d1f31139. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1297.755618] env[68964]: DEBUG nova.network.neutron [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Updating instance_info_cache with network_info: [{"id": "6bbef264-d5c8-4b36-b383-5511d1f31139", "address": "fa:16:3e:a0:37:89", "network": {"id": "1d5ce5e3-f4f8-41a1-907b-b827bdd8c2e8", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-756245163-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7700973a8cd04067994a0dabd569727c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6bbef264-d5", "ovs_interfaceid": "6bbef264-d5c8-4b36-b383-5511d1f31139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1297.766315] env[68964]: DEBUG oslo_concurrency.lockutils [req-ceae367b-9ec8-4ca4-a90c-0bfb984e1870 req-47f455c4-8194-4628-b19f-7df53f10ff72 service nova] Releasing lock "refresh_cache-1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" {{(pid=68964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1297.981597] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431694, 'name': CreateVM_Task, 'duration_secs': 0.28261} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1297.981873] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1297.982533] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1297.982702] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1297.983020] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1297.983268] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0e23ba05-7ee1-4f36-bae9-c09893c4c701 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.987712] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Waiting for the task: (returnval){ [ 1297.987712] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5283a835-b690-37c4-a2f4-f2c5cc499dc6" [ 1297.987712] env[68964]: _type = "Task" [ 1297.987712] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1297.994942] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5283a835-b690-37c4-a2f4-f2c5cc499dc6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1298.498846] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1298.499150] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1298.499340] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1300.615042] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2c394477-3fc3-4e69-9c83-c3c767d95fe3 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "fbf98cba-22ba-4ad6-8d97-59bbbbf56e90" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1300.615323] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2c394477-3fc3-4e69-9c83-c3c767d95fe3 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "fbf98cba-22ba-4ad6-8d97-59bbbbf56e90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1307.057816] env[68964]: WARNING oslo_vmware.rw_handles [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1307.057816] env[68964]: ERROR oslo_vmware.rw_handles
[ 1307.058338] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1307.060554] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1307.060760] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Copying Virtual Disk [datastore1] vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/225e0133-5d01-4e7b-9950-3d6ef5707a2f/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1307.061218] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f4da95f8-dd80-4722-bd77-bf2edfe70886 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1307.069743] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Waiting for the task: (returnval){
[ 1307.069743] env[68964]: value = "task-3431695"
[ 1307.069743] env[68964]: _type = "Task"
[ 1307.069743] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1307.077791] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Task: {'id': task-3431695, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1307.580157] env[68964]: DEBUG oslo_vmware.exceptions [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Fault InvalidArgument not matched.
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1307.580405] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1307.580973] env[68964]: ERROR nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1307.580973] env[68964]: Faults: ['InvalidArgument'] [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Traceback (most recent call last): [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] yield resources [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self.driver.spawn(context, instance, image_meta, [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self._fetch_image_if_missing(context, vi) [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] image_cache(vi, tmp_image_ds_loc) [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] vm_util.copy_virtual_disk( [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] session._wait_for_task(vmdk_copy_task) [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] return self.wait_for_task(task_ref) [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] return evt.wait() [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] result = hub.switch() [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] return self.greenlet.switch() [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self.f(*self.args, **self.kw) [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] raise exceptions.translate_fault(task_info.error) [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Faults: ['InvalidArgument'] [ 1307.580973] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] [ 1307.581655] env[68964]: INFO nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Terminating instance [ 1307.582841] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1307.583932] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1307.584589] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 
tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1307.584835] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1307.585123] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cefc6a82-97f8-4743-8845-bef203acb767 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.587926] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-810d8d24-f9f7-4e02-8ba9-8997c79009c2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.594759] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1307.595087] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-38b1dca6-1675-4331-b558-145380163480 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.597246] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1307.597418] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1307.598394] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-75043ee4-919d-4cce-abc8-bb04dfaa98ea {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.605305] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Waiting for the task: (returnval){ [ 1307.605305] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52310eff-9a5b-303e-869b-13bb063eab5f" [ 1307.605305] env[68964]: _type = "Task" [ 1307.605305] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1307.614384] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52310eff-9a5b-303e-869b-13bb063eab5f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1307.658554] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1307.658815] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1307.659054] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Deleting the datastore file [datastore1] 2d0469ba-ad42-4b06-ade2-cd64487278c5 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1307.659327] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ff1dca6b-ebe5-41a5-903a-36e282f4717f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1307.665544] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Waiting for the task: (returnval){ [ 1307.665544] env[68964]: value = "task-3431697" [ 1307.665544] env[68964]: _type = "Task" [ 1307.665544] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1307.673517] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Task: {'id': task-3431697, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1308.115917] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1308.116221] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Creating directory with path [datastore1] vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1308.116463] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-437b52f4-b098-4cbb-87bf-bf8a13eaab27 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.127801] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Created directory with path [datastore1] vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1308.128050] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Fetch image to [datastore1] vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1308.128234] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1308.128978] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6bf520e-3807-4685-aeb4-ab456b8f25a6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.135908] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e289814e-0d4b-4fc7-992a-6166ec89249e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.145182] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-849e32e4-f9ff-432a-b214-4d49238a9ecd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.179016] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f31020d5-c760-4cf3-a7e1-3c6383929a57 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.186130] env[68964]: DEBUG oslo_vmware.api [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Task: {'id': task-3431697, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077461} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1308.187613] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1308.187865] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1308.188037] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1308.188223] env[68964]: INFO nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Took 0.60 seconds to destroy the instance on the hypervisor. 
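Every lock in these entries is bracketed by a "waited N s" line on acquisition and a "held N s" line on release, which is what makes contention (such as the 239.406s wait reported further below) visible in the log. A minimal standard-library sketch of that instrumentation (an illustration of the pattern only, not oslo.concurrency's actual implementation; timed_lock is a hypothetical helper):

import threading
import time
from contextlib import contextmanager

_locks = {}  # one shared Lock per name, keyed the way the log keys them

@contextmanager
def timed_lock(name, caller):
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

# Usage shaped like the claim-abort entries that follow:
with timed_lock("compute_resources", "ResourceTracker.abort_instance_claim"):
    pass  # return the instance's CPU/RAM/disk claim to the pool here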
[ 1308.190265] env[68964]: DEBUG nova.compute.claims [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1308.190430] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.190640] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.193077] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6371142f-8650-4206-bdd3-391d76dc713e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.216876] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1308.274713] env[68964]: DEBUG oslo_vmware.rw_handles [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1308.333403] env[68964]: DEBUG oslo_vmware.rw_handles [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1308.333597] env[68964]: DEBUG oslo_vmware.rw_handles [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1308.547103] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e8a3d86-e427-40d9-9c38-4215e59eea51 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.554034] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9972aac-a971-4e04-8d36-752f58c9a893 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.583257] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0325ca79-8d63-4ebb-83ba-097b593f2bbf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.589706] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a838143-f439-42ba-8647-b27b8d673f9d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.602374] env[68964]: DEBUG nova.compute.provider_tree [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1308.611050] env[68964]: DEBUG nova.scheduler.client.report [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1308.624902] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.434s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.625466] env[68964]: ERROR nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1308.625466] env[68964]: Faults: ['InvalidArgument'] [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Traceback (most recent call last): [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self.driver.spawn(context, instance, image_meta, [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self._fetch_image_if_missing(context, vi) [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] image_cache(vi, tmp_image_ds_loc) [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] vm_util.copy_virtual_disk( [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] session._wait_for_task(vmdk_copy_task) [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] return self.wait_for_task(task_ref) [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] return evt.wait() [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] result = hub.switch() [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] return self.greenlet.switch() [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] self.f(*self.args, **self.kw) [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 
2d0469ba-ad42-4b06-ade2-cd64487278c5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] raise exceptions.translate_fault(task_info.error) [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Faults: ['InvalidArgument'] [ 1308.625466] env[68964]: ERROR nova.compute.manager [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] [ 1308.626250] env[68964]: DEBUG nova.compute.utils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1308.627507] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Build of instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 was re-scheduled: A specified parameter was not correct: fileType [ 1308.627507] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1308.627876] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1308.628092] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1308.628273] env[68964]: DEBUG nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1308.628430] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1309.027845] env[68964]: DEBUG nova.network.neutron [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1309.037660] env[68964]: INFO nova.compute.manager [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Took 0.41 seconds to deallocate network for instance. [ 1309.124210] env[68964]: INFO nova.scheduler.client.report [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Deleted allocations for instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 [ 1309.148386] env[68964]: DEBUG oslo_concurrency.lockutils [None req-3bfee58f-db63-46a6-9418-69cb5fb886dc tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 435.933s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1309.149235] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 239.406s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1309.149444] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Acquiring lock "2d0469ba-ad42-4b06-ade2-cd64487278c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1309.149671] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1309.149846] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1309.152134] env[68964]: INFO nova.compute.manager [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Terminating instance [ 1309.153764] env[68964]: DEBUG nova.compute.manager [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1309.154091] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1309.155049] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-53794622-d7d8-4bdb-ac0d-14ca3f46a736 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.163032] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1309.166743] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb3478b8-6c2e-41a6-af51-02ce5362434c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.202028] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2d0469ba-ad42-4b06-ade2-cd64487278c5 could not be found. 
[ 1309.202711] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1309.202711] env[68964]: INFO nova.compute.manager [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1309.202711] env[68964]: DEBUG oslo.service.loopingcall [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1309.208067] env[68964]: DEBUG nova.compute.manager [-] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1309.208204] env[68964]: DEBUG nova.network.neutron [-] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1309.221499] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1309.221729] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1309.223176] env[68964]: INFO nova.compute.claims [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1309.249708] env[68964]: DEBUG nova.network.neutron [-] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1309.259136] env[68964]: INFO nova.compute.manager [-] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] Took 0.05 seconds to deallocate network for instance. 
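[Editor's note] The inventory dict that the report client logs repeatedly in this section fully determines what the scheduler can place on this node: for each resource class, Placement treats (total - reserved) * allocation_ratio as the consumable capacity, in multiples of step_size and bounded per allocation by min_unit/max_unit. The snippet below is just that standard capacity formula applied to the logged values, not Nova code.

```python
# Inventory exactly as logged for provider 63b0294e-f555-48a6-a542-3466427066a9.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98,
                'step_size': 1, 'allocation_ratio': 1.0},
}


def effective_capacity(inv):
    # Placement's capacity rule: (total - reserved) * allocation_ratio.
    return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inv.items()}


print(effective_capacity(inventory))
# {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
```

With the 4.0 VCPU overcommit the node advertises 192 schedulable vCPUs even though max_unit caps any single instance at 16.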
[ 1309.355165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f92e769b-7175-417a-a10e-37215612855e tempest-ServerRescueTestJSONUnderV235-6081560 tempest-ServerRescueTestJSONUnderV235-6081560-project-member] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.206s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1309.356078] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 61.735s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1309.356307] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2d0469ba-ad42-4b06-ade2-cd64487278c5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1309.356538] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "2d0469ba-ad42-4b06-ade2-cd64487278c5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1309.530278] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aad01fd-cbe6-4866-bd07-64131b0f167c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.537945] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc5a1274-e494-4074-9a10-05210f0dd703 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.567870] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae5aaec-c4c5-4982-9494-1a69da21b395 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.574831] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c029a946-f1d8-4a60-b982-3652bcd87f90 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.589067] env[68964]: DEBUG nova.compute.provider_tree [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1309.598095] env[68964]: DEBUG nova.scheduler.client.report [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 
0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1309.610601] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.389s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1309.611090] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1309.639912] env[68964]: DEBUG nova.compute.utils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1309.641614] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1309.641687] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1309.650533] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1309.697929] env[68964]: DEBUG nova.policy [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58614d4e7f4547a6b1c81836a3f0b85f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1fad14028c3743529989ce4546c129e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1309.707989] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1309.732679] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1309.732923] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1309.733094] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1309.733282] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1309.733428] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1309.733614] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1309.733831] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1309.734007] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1309.734186] env[68964]: DEBUG nova.virt.hardware [None 
req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1309.734352] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1309.734525] env[68964]: DEBUG nova.virt.hardware [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1309.735378] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ce757ba-9526-41d3-92ea-dc8496b3e546 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1309.743013] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-767aee6f-061c-4b37-8129-3e3c562337d1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1310.075296] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Successfully created port: ecfbb28e-593f-4b75-8188-76dd2d996c9c {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1310.800689] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Successfully updated port: ecfbb28e-593f-4b75-8188-76dd2d996c9c {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1310.810920] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "refresh_cache-1b41b7f3-3ae4-48ca-aefc-5563060199d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1310.811391] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquired lock "refresh_cache-1b41b7f3-3ae4-48ca-aefc-5563060199d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1310.811391] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1310.851031] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 
tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1311.052786] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Updating instance_info_cache with network_info: [{"id": "ecfbb28e-593f-4b75-8188-76dd2d996c9c", "address": "fa:16:3e:ae:3d:65", "network": {"id": "5f59ccf0-e333-47f7-a18b-35c84760a1a6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1133141552-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1fad14028c3743529989ce4546c129e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapecfbb28e-59", "ovs_interfaceid": "ecfbb28e-593f-4b75-8188-76dd2d996c9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1311.067130] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Releasing lock "refresh_cache-1b41b7f3-3ae4-48ca-aefc-5563060199d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1311.067456] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance network_info: |[{"id": "ecfbb28e-593f-4b75-8188-76dd2d996c9c", "address": "fa:16:3e:ae:3d:65", "network": {"id": "5f59ccf0-e333-47f7-a18b-35c84760a1a6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1133141552-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1fad14028c3743529989ce4546c129e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapecfbb28e-59", "ovs_interfaceid": "ecfbb28e-593f-4b75-8188-76dd2d996c9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1311.067847] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ae:3d:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '09bf081b-cdf0-4977-abe2-2339a87409ab', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ecfbb28e-593f-4b75-8188-76dd2d996c9c', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1311.075314] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Creating folder: Project (1fad14028c3743529989ce4546c129e4). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1311.075880] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-06f0c4a3-ba55-4928-b8da-c40a1950d0c4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.087325] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Created folder: Project (1fad14028c3743529989ce4546c129e4) in parent group-v684465. [ 1311.087510] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Creating folder: Instances. Parent ref: group-v684572. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1311.087734] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3dce3047-7c0c-405c-a4d6-ee225501ca89 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.099855] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Created folder: Instances in parent group-v684572. [ 1311.100098] env[68964]: DEBUG oslo.service.loopingcall [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1311.100285] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1311.100480] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-57428fdd-5f30-4f14-8676-c075347b0494 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.119461] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1311.119461] env[68964]: value = "task-3431700" [ 1311.119461] env[68964]: _type = "Task" [ 1311.119461] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1311.127042] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431700, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1311.148774] env[68964]: DEBUG nova.compute.manager [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Received event network-vif-plugged-ecfbb28e-593f-4b75-8188-76dd2d996c9c {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1311.148774] env[68964]: DEBUG oslo_concurrency.lockutils [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] Acquiring lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1311.148849] env[68964]: DEBUG oslo_concurrency.lockutils [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1311.149021] env[68964]: DEBUG oslo_concurrency.lockutils [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1311.149431] env[68964]: DEBUG nova.compute.manager [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] No waiting events found dispatching network-vif-plugged-ecfbb28e-593f-4b75-8188-76dd2d996c9c {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1311.149576] env[68964]: WARNING nova.compute.manager [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Received unexpected event network-vif-plugged-ecfbb28e-593f-4b75-8188-76dd2d996c9c for instance with vm_state building and task_state spawning. 
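[Editor's note] Almost every oslo_concurrency.lockutils line in this log follows the same lifecycle: 'Acquiring lock X by Y', 'Lock X acquired by Y :: waited N s', then 'Lock X "released" by Y :: held N s'. Both variants seen here come from oslo.concurrency's public API; a minimal sketch, assuming oslo.concurrency is installed (the lock names are taken from the records above and the function bodies are placeholders):

```python
from oslo_concurrency import lockutils


# Decorator form: emits the 'Lock "..." acquired by "..." :: waited N s' and
# '"released" by "..." :: held N s' DEBUG lines seen throughout this log,
# using the decorated callable's qualified name as the "by" target.
@lockutils.synchronized('compute_resources')
def instance_claim():
    pass  # critical section, e.g. resource tracker claim bookkeeping


# Context-manager form: emits the plain 'Acquiring lock "..."' /
# 'Releasing lock "..."' lines, as with the refresh_cache and datastore
# image-cache locks above.
def pop_instance_events():
    with lockutils.lock('1b41b7f3-3ae4-48ca-aefc-5563060199d5-events'):
        pass  # pop or clear per-instance external events
```

By default these locks are in-process semaphores; passing external=True would add a file-based lock shared across processes.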
[ 1311.149802] env[68964]: DEBUG nova.compute.manager [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Received event network-changed-ecfbb28e-593f-4b75-8188-76dd2d996c9c {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1311.149876] env[68964]: DEBUG nova.compute.manager [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Refreshing instance network info cache due to event network-changed-ecfbb28e-593f-4b75-8188-76dd2d996c9c. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1311.150083] env[68964]: DEBUG oslo_concurrency.lockutils [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] Acquiring lock "refresh_cache-1b41b7f3-3ae4-48ca-aefc-5563060199d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1311.150224] env[68964]: DEBUG oslo_concurrency.lockutils [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] Acquired lock "refresh_cache-1b41b7f3-3ae4-48ca-aefc-5563060199d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1311.150379] env[68964]: DEBUG nova.network.neutron [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Refreshing network info cache for port ecfbb28e-593f-4b75-8188-76dd2d996c9c {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1311.632386] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431700, 'name': CreateVM_Task, 'duration_secs': 0.288121} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1311.632560] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1311.633426] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1311.633667] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1311.634063] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1311.634315] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a7d541eb-0cf8-4315-bef5-0da28719d717 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1311.638796] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Waiting for the task: (returnval){ [ 1311.638796] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529b5a2d-a9a1-5431-2a97-aac18579ecda" [ 1311.638796] env[68964]: _type = "Task" [ 1311.638796] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1311.646720] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]529b5a2d-a9a1-5431-2a97-aac18579ecda, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1311.830974] env[68964]: DEBUG nova.network.neutron [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Updated VIF entry in instance network info cache for port ecfbb28e-593f-4b75-8188-76dd2d996c9c. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1311.831328] env[68964]: DEBUG nova.network.neutron [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Updating instance_info_cache with network_info: [{"id": "ecfbb28e-593f-4b75-8188-76dd2d996c9c", "address": "fa:16:3e:ae:3d:65", "network": {"id": "5f59ccf0-e333-47f7-a18b-35c84760a1a6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1133141552-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1fad14028c3743529989ce4546c129e4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapecfbb28e-59", "ovs_interfaceid": "ecfbb28e-593f-4b75-8188-76dd2d996c9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1311.844443] env[68964]: DEBUG oslo_concurrency.lockutils [req-148367dc-4e11-4ff6-b516-6c4ba838991a req-5914eb10-9698-4c09-8e3a-1cd59e5cb763 service nova] Releasing lock "refresh_cache-1b41b7f3-3ae4-48ca-aefc-5563060199d5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1312.149862] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1312.150229] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1312.150711] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1313.969459] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1315.834820] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1315.835125] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1319.786554] env[68964]: DEBUG oslo_concurrency.lockutils [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1326.724913] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.720643] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.724304] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.724494] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1327.724648] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1328.725163] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1328.725465] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1328.725465] env[68964]: DEBUG 
nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1328.748589] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.748756] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.748876] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749015] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749214] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749340] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749463] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749585] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749702] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749822] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1328.749942] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1329.724942] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1329.725183] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1330.724205] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1330.724457] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1330.736326] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1330.736602] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1330.736708] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1330.736893] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1330.738034] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b91015a1-d8e6-41b6-bdef-648ed47dcc24 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.748460] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8202fc09-253a-4301-8e30-fc27e7f344b7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.762229] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e85261-eb69-4e84-b645-5f86b454e65b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.768239] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-59c7ac6d-f3eb-47fe-8f01-be6ee2b223d7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.796764] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180955MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1330.796764] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1330.796919] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1330.870484] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.870645] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.870773] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.870895] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.871027] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.871150] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.871266] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.871381] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.871497] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.871613] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1330.881911] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 15faae57-ab24-417e-9bf2-1aee11ccc2f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.892016] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.900667] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7e9acc0-1427-4382-bcf8-99fdcc08aac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.910017] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e437a43d-00b7-4feb-ae97-215238cf845b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.918412] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 43a9c974-399f-44cb-b836-4bdf17a8d768 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.927171] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 530b7cf0-53d3-4bfb-b545-b0bb57dc91b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.938122] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance db7c702e-3d49-4eb7-9d7e-c715186f1f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.948614] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 178b3fc0-3f93-400f-ba19-d9703c62fd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.962079] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f64ec65-fc61-4c44-a489-a36b9f8750e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.972351] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.981806] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1330.991863] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1331.001497] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fbf98cba-22ba-4ad6-8d97-59bbbbf56e90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1331.011750] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1331.012038] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1331.012209] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1331.259934] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dce146a5-a58a-413e-a7ae-6782d76459da {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.267437] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e16a46c9-191a-4bbc-8513-d27d610260a8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.297034] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd505d23-dfaf-40af-bbc0-b7c555ba8b99 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.303722] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d538d8d4-4b00-45e8-8250-d47bd09f00f1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.316452] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1331.324924] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1331.356543] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1331.356744] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1332.353086] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1332.726315] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1358.127208] env[68964]: WARNING oslo_vmware.rw_handles [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1358.127208] env[68964]: ERROR oslo_vmware.rw_handles [ 1358.127827] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1358.129937] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1358.130246] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Copying Virtual Disk [datastore1] vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] 
vmware_temp/649d76b3-1534-4c67-ba05-694999efe60c/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1358.130541] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2a045d69-028e-4011-86bb-f8435703a370 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.138341] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Waiting for the task: (returnval){ [ 1358.138341] env[68964]: value = "task-3431701" [ 1358.138341] env[68964]: _type = "Task" [ 1358.138341] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1358.146152] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Task: {'id': task-3431701, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1358.649298] env[68964]: DEBUG oslo_vmware.exceptions [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1358.649594] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1358.650155] env[68964]: ERROR nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1358.650155] env[68964]: Faults: ['InvalidArgument'] [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Traceback (most recent call last): [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] yield resources [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self.driver.spawn(context, instance, image_meta, [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 
238794bb-9995-4bb0-954d-7ca0ef825e19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self._fetch_image_if_missing(context, vi) [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] image_cache(vi, tmp_image_ds_loc) [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] vm_util.copy_virtual_disk( [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] session._wait_for_task(vmdk_copy_task) [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] return self.wait_for_task(task_ref) [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] return evt.wait() [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] result = hub.switch() [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] return self.greenlet.switch() [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self.f(*self.args, **self.kw) [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] raise exceptions.translate_fault(task_info.error) [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Faults: ['InvalidArgument'] [ 1358.650155] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] [ 1358.651066] env[68964]: INFO nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Terminating instance [ 1358.652191] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1358.652466] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1358.652673] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4e589f9d-7cc5-47fb-922b-4a93d3a08b4b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.655114] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1358.655307] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1358.656058] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85b07e92-3da6-4ce7-95a6-78506b7aed29 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.663531] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1358.664651] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fcfbb8be-a5d8-40cd-b116-88d586b2f172 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.666136] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1358.666315] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1358.666994] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-db080b94-8183-428b-9b1a-0245f5fa44d9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.673411] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Waiting for the task: (returnval){ [ 1358.673411] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]523e815e-f607-fcea-9738-a1debe49f895" [ 1358.673411] env[68964]: _type = "Task" [ 1358.673411] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1358.681125] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]523e815e-f607-fcea-9738-a1debe49f895, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1358.765574] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1358.765806] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1358.765984] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Deleting the datastore file [datastore1] 238794bb-9995-4bb0-954d-7ca0ef825e19 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1358.766287] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fb3d1369-a16e-4d1c-af2d-816921aaff04 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1358.773552] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Waiting for the task: (returnval){ [ 1358.773552] env[68964]: value = "task-3431703" [ 1358.773552] env[68964]: _type = "Task" [ 1358.773552] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1358.781078] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Task: {'id': task-3431703, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1359.183960] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1359.184341] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Creating directory with path [datastore1] vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1359.184509] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2da5b491-f3c6-49ed-a4e1-8c8ce2fdd961 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.196801] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Created directory with path [datastore1] vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1359.196997] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Fetch image to [datastore1] vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1359.197192] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1359.197911] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3f52eac-713f-4897-bac2-d59d40403422 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.204480] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49ffc672-cea0-4537-8bdc-beffbe44fdc2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.213328] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-624b1be9-7034-45e2-98bc-4daf298a47de {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.244947] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a12f385-90e3-445b-a6dc-58f78297ad2e {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.250535] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-723accf3-39fb-42f4-8daf-f040fe1f0596 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.272395] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1359.281688] env[68964]: DEBUG oslo_vmware.api [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Task: {'id': task-3431703, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088319} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1359.281966] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1359.282189] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1359.282412] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1359.282624] env[68964]: INFO nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Took 0.63 seconds to destroy the instance on the hypervisor. 
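
The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above follow one pattern: submit a vCenter task, poll it ("progress is 0%") until it completes or faults, and on a fault let oslo.vmware translate the task error into an exception such as the VimFaultException in the traceback. A minimal sketch of that polling loop follows; the names (TaskFaulted, get_task_info) are illustrative stand-ins, not the actual oslo.vmware API.

    import time

    class TaskFaulted(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until the task leaves its running state, mirroring the
        # _poll_task entries above. get_task_info is assumed to return a
        # dict like {'state': ..., 'result': ..., 'error': ...}.
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # oslo.vmware raises exceptions.translate_fault(task_info.error)
                # at this point; here we simply wrap the fault text.
                raise TaskFaulted(info['error'])
            time.sleep(interval)
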
[ 1359.285810] env[68964]: DEBUG nova.compute.claims [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1359.285980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1359.286209] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1359.322353] env[68964]: DEBUG oslo_vmware.rw_handles [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1359.382220] env[68964]: DEBUG oslo_vmware.rw_handles [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1359.383033] env[68964]: DEBUG oslo_vmware.rw_handles [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1359.643030] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e10e3c-7f4f-4876-9144-fc19d7b42374 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.650311] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae6f458e-6b65-4bbd-9dad-7fcd659d0002 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.680204] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c1f80f-9970-443d-975a-0cd914ec98b6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.688085] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30a99781-bcd2-4635-94bb-e7f5b077aa13 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.700680] env[68964]: DEBUG nova.compute.provider_tree [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1359.711193] env[68964]: DEBUG nova.scheduler.client.report [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1359.728129] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.442s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1359.728963] env[68964]: ERROR nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1359.728963] env[68964]: Faults: ['InvalidArgument'] [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Traceback (most recent call last): [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1359.728963] env[68964]: ERROR nova.compute.manager 
[instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self.driver.spawn(context, instance, image_meta, [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self._fetch_image_if_missing(context, vi) [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] image_cache(vi, tmp_image_ds_loc) [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] vm_util.copy_virtual_disk( [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] session._wait_for_task(vmdk_copy_task) [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] return self.wait_for_task(task_ref) [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] return evt.wait() [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] result = hub.switch() [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] return self.greenlet.switch() [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] self.f(*self.args, **self.kw) [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] raise exceptions.translate_fault(task_info.error) [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Faults: ['InvalidArgument'] [ 1359.728963] env[68964]: ERROR nova.compute.manager [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] [ 1359.729846] env[68964]: DEBUG nova.compute.utils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1359.732810] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Build of instance 238794bb-9995-4bb0-954d-7ca0ef825e19 was re-scheduled: A specified parameter was not correct: fileType [ 1359.732810] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1359.733191] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1359.733367] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1359.733532] env[68964]: DEBUG nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1359.733694] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1360.368231] env[68964]: DEBUG nova.network.neutron [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1360.380846] env[68964]: INFO nova.compute.manager [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Took 0.65 seconds to deallocate network for instance. [ 1360.483262] env[68964]: INFO nova.scheduler.client.report [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Deleted allocations for instance 238794bb-9995-4bb0-954d-7ca0ef825e19 [ 1360.508420] env[68964]: DEBUG oslo_concurrency.lockutils [None req-762d31df-2428-4649-9715-b28ba3124911 tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 472.340s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1360.510308] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 275.174s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1360.510715] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Acquiring lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1360.512992] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1360.512992] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1360.514304] env[68964]: INFO nova.compute.manager [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Terminating instance [ 1360.516685] env[68964]: DEBUG nova.compute.manager [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1360.517009] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1360.524287] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-29fa9434-7aab-447a-bd32-cf1159991ded {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.524287] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 15faae57-ab24-417e-9bf2-1aee11ccc2f6] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1360.535410] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41376dba-220b-431b-8e57-e529488ff2cf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.549988] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 15faae57-ab24-417e-9bf2-1aee11ccc2f6] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1360.565396] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 238794bb-9995-4bb0-954d-7ca0ef825e19 could not be found. 
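
The terminate path above is deliberately idempotent: the first (failed) build already unregistered the VM and deleted its datastore files, so when do_terminate_instance finally acquires the instance lock, the backend lookup raises InstanceNotFound and vmops logs "Instance does not exist on backend" and proceeds to the rest of the cleanup. A sketch of that pattern, with hypothetical helper names rather than the actual vmops code:

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy(backend, instance_uuid):
        try:
            backend.unregister_vm(instance_uuid)   # cf. VirtualMachine.UnregisterVM
            backend.delete_files(instance_uuid)    # cf. FileManager.DeleteDatastoreFile_Task
        except InstanceNotFound:
            # Already gone (e.g. cleaned up by a failed build); treat the
            # destroy as successful so the delete can still complete.
            pass
        # Network deallocation and allocation cleanup proceed regardless.
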
[ 1360.565587] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1360.565758] env[68964]: INFO nova.compute.manager [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1360.565993] env[68964]: DEBUG oslo.service.loopingcall [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1360.568169] env[68964]: DEBUG nova.compute.manager [-] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1360.568169] env[68964]: DEBUG nova.network.neutron [-] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1360.580356] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "15faae57-ab24-417e-9bf2-1aee11ccc2f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.986s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1360.589425] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1360.593401] env[68964]: DEBUG nova.network.neutron [-] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1360.599613] env[68964]: INFO nova.compute.manager [-] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] Took 0.03 seconds to deallocate network for instance. 
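
The oslo.service looping call above ("Waiting for function ..._deallocate_network_with_retries to return") wraps the Neutron deallocation in retry machinery so a transient failure does not strand the port bindings. The underlying retry idea, in plain Python (illustrative only, not the oslo_service API):

    import time

    def call_with_retries(func, attempts=3, delay=2.0):
        # Re-invoke func until it succeeds or the attempts are exhausted;
        # the real code logs each failure and waits between tries.
        for attempt in range(1, attempts + 1):
            try:
                return func()
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(delay)
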
[ 1360.654777] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1360.655007] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1360.656510] env[68964]: INFO nova.compute.claims [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1360.708695] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1c9a5eac-98a3-48eb-8e66-fccbc74d632f tempest-ServersTestMultiNic-1776558078 tempest-ServersTestMultiNic-1776558078-project-member] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.199s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1360.710677] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 113.089s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1360.710677] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 238794bb-9995-4bb0-954d-7ca0ef825e19] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1360.710677] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "238794bb-9995-4bb0-954d-7ca0ef825e19" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1360.971026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00b1f48b-d93a-4d56-83a8-cdbc6073fdfb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.981208] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a985e838-ec26-4a1e-a4ec-33999664d9e3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.010582] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bc17e9d-2263-4584-a85e-784dbb970197 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.017934] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0613f25a-f05b-477a-ae8e-888c95c605fd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.031177] env[68964]: DEBUG nova.compute.provider_tree [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1361.039741] env[68964]: DEBUG nova.scheduler.client.report [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1361.053237] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1361.053682] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Start building networks asynchronously for instance.
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1361.083278] env[68964]: DEBUG nova.compute.utils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1361.084407] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1361.084580] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1361.094412] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1361.156021] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1361.166856] env[68964]: DEBUG nova.policy [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4580824b98404c0f85c5c1395a980c9f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28145fe8e6864e02b24fe4162fa45711', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1361.180025] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False
{{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1361.180215] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1361.180373] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1361.180606] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1361.180784] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1361.180933] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1361.181167] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1361.181328] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1361.181506] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1361.181665] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1361.181832] env[68964]: DEBUG nova.virt.hardware [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1361.182798] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda7c7b3-4845-43ca-84d5-c9816e5905c6 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.190952] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abe149a2-a8d6-40d6-825d-4634cf4bc035 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1361.476470] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Successfully created port: 23d65097-c051-4922-82ef-ef0ca42c5fb8 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1362.196043] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Successfully updated port: 23d65097-c051-4922-82ef-ef0ca42c5fb8 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1362.209669] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "refresh_cache-07ea329b-3934-437a-8b44-57045e86c310" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1362.210258] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquired lock "refresh_cache-07ea329b-3934-437a-8b44-57045e86c310" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1362.210451] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1362.234200] env[68964]: DEBUG nova.compute.manager [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Received event network-vif-plugged-23d65097-c051-4922-82ef-ef0ca42c5fb8 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1362.234538] env[68964]: DEBUG oslo_concurrency.lockutils [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] Acquiring lock "07ea329b-3934-437a-8b44-57045e86c310-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1362.234665] env[68964]: DEBUG oslo_concurrency.lockutils [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] Lock "07ea329b-3934-437a-8b44-57045e86c310-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1362.234823] env[68964]: DEBUG oslo_concurrency.lockutils [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] Lock
"07ea329b-3934-437a-8b44-57045e86c310-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1362.235033] env[68964]: DEBUG nova.compute.manager [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] No waiting events found dispatching network-vif-plugged-23d65097-c051-4922-82ef-ef0ca42c5fb8 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1362.235646] env[68964]: WARNING nova.compute.manager [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Received unexpected event network-vif-plugged-23d65097-c051-4922-82ef-ef0ca42c5fb8 for instance with vm_state building and task_state spawning. [ 1362.235862] env[68964]: DEBUG nova.compute.manager [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Received event network-changed-23d65097-c051-4922-82ef-ef0ca42c5fb8 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1362.236039] env[68964]: DEBUG nova.compute.manager [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Refreshing instance network info cache due to event network-changed-23d65097-c051-4922-82ef-ef0ca42c5fb8. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1362.236212] env[68964]: DEBUG oslo_concurrency.lockutils [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] Acquiring lock "refresh_cache-07ea329b-3934-437a-8b44-57045e86c310" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1362.252789] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1362.456613] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Updating instance_info_cache with network_info: [{"id": "23d65097-c051-4922-82ef-ef0ca42c5fb8", "address": "fa:16:3e:50:98:86", "network": {"id": "78dabd9f-bfac-49e9-9c38-e92e140d914f", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1409782726-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "28145fe8e6864e02b24fe4162fa45711", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c6eaa481-1f92-4851-b98e-09ed0daad7cb", "external-id": "nsx-vlan-transportzone-636", "segmentation_id": 636, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap23d65097-c0", "ovs_interfaceid": "23d65097-c051-4922-82ef-ef0ca42c5fb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1362.469578] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Releasing lock "refresh_cache-07ea329b-3934-437a-8b44-57045e86c310" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1362.469854] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance network_info: |[{"id": "23d65097-c051-4922-82ef-ef0ca42c5fb8", "address": "fa:16:3e:50:98:86", "network": {"id": "78dabd9f-bfac-49e9-9c38-e92e140d914f", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1409782726-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "28145fe8e6864e02b24fe4162fa45711", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c6eaa481-1f92-4851-b98e-09ed0daad7cb", "external-id": "nsx-vlan-transportzone-636", "segmentation_id": 636, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap23d65097-c0", "ovs_interfaceid": "23d65097-c051-4922-82ef-ef0ca42c5fb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1362.470183] env[68964]: DEBUG oslo_concurrency.lockutils [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] Acquired lock "refresh_cache-07ea329b-3934-437a-8b44-57045e86c310" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1362.470362] env[68964]: DEBUG nova.network.neutron [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Refreshing network info cache for port 23d65097-c051-4922-82ef-ef0ca42c5fb8 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1362.471317] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:50:98:86', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c6eaa481-1f92-4851-b98e-09ed0daad7cb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '23d65097-c051-4922-82ef-ef0ca42c5fb8', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1362.479256] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Creating folder: Project (28145fe8e6864e02b24fe4162fa45711). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1362.482000] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-88ed428e-254a-40d1-972e-a2e1c51237e7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.493102] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Created folder: Project (28145fe8e6864e02b24fe4162fa45711) in parent group-v684465. [ 1362.493289] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Creating folder: Instances. Parent ref: group-v684575. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1362.493508] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5727a9dd-4fed-40b4-a687-217b7ad5ec35 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.502889] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Created folder: Instances in parent group-v684575. [ 1362.503121] env[68964]: DEBUG oslo.service.loopingcall [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1362.503299] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1362.503489] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4cc63529-ec8f-4604-9636-cbc4155fca5e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1362.523523] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1362.523523] env[68964]: value = "task-3431706" [ 1362.523523] env[68964]: _type = "Task" [ 1362.523523] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1362.530925] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431706, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1362.728229] env[68964]: DEBUG nova.network.neutron [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Updated VIF entry in instance network info cache for port 23d65097-c051-4922-82ef-ef0ca42c5fb8. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1362.728629] env[68964]: DEBUG nova.network.neutron [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Updating instance_info_cache with network_info: [{"id": "23d65097-c051-4922-82ef-ef0ca42c5fb8", "address": "fa:16:3e:50:98:86", "network": {"id": "78dabd9f-bfac-49e9-9c38-e92e140d914f", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1409782726-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "28145fe8e6864e02b24fe4162fa45711", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c6eaa481-1f92-4851-b98e-09ed0daad7cb", "external-id": "nsx-vlan-transportzone-636", "segmentation_id": 636, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap23d65097-c0", "ovs_interfaceid": "23d65097-c051-4922-82ef-ef0ca42c5fb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1362.739501] env[68964]: DEBUG oslo_concurrency.lockutils [req-43c2510e-0ee4-437c-82ad-9c448ac40178 req-29ba0734-1887-4f24-a40f-da238c845430 service nova] Releasing lock "refresh_cache-07ea329b-3934-437a-8b44-57045e86c310" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.033655] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431706, 'name': CreateVM_Task, 'duration_secs': 0.280194} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1363.033822] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1363.034527] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1363.034682] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.034985] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1363.035241] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c5ee03b2-0eaa-41e1-a132-51b7d0bf6ae7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.039499] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Waiting for the task: (returnval){ [ 1363.039499] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52231ff4-bd1e-ffde-6bae-090fb6781e7c" [ 1363.039499] env[68964]: _type = "Task" [ 1363.039499] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.046789] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52231ff4-bd1e-ffde-6bae-090fb6781e7c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.550388] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.550664] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1363.550857] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1364.543752] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "07ea329b-3934-437a-8b44-57045e86c310" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1387.725123] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1388.727120] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1388.727120] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1388.727120] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1388.750685] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.750900] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Skipping network cache update for instance because it is Building.
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751061] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751197] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751319] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751439] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751558] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751677] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751794] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.751908] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1388.752039] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1388.752584] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1388.752764] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1389.725060] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1389.725060] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1390.724625] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1390.724910] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1391.725083] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1391.725411] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1391.737245] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1391.737461] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1391.737655] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1391.737817] env[68964]: DEBUG nova.compute.resource_tracker [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1391.738948] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa267f84-dc0c-4f41-8a11-d9f2ee4ede7d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.748082] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cb39251-d3a9-448c-aab9-ec58ddc8f978 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.762723] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-193311e5-8497-4ed8-b97c-7de28ceb3174 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.769111] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3fcb1a-200c-4456-a70a-e4242f178e69 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1391.798885] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180943MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1391.799030] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1391.799234] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1391.896476] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 4d272615-e2dd-4540-88d0-4a209f559147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.896633] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.896759] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.896883] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.897014] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.897136] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.897254] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.897370] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.897483] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.897599] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1391.914845] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c7e9acc0-1427-4382-bcf8-99fdcc08aac0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.925672] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance e437a43d-00b7-4feb-ae97-215238cf845b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.936383] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 43a9c974-399f-44cb-b836-4bdf17a8d768 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.947170] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 530b7cf0-53d3-4bfb-b545-b0bb57dc91b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.956322] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance db7c702e-3d49-4eb7-9d7e-c715186f1f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.966269] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 178b3fc0-3f93-400f-ba19-d9703c62fd22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.976139] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7f64ec65-fc61-4c44-a489-a36b9f8750e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1391.986740] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1391.996664] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1392.006775] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1392.017620] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fbf98cba-22ba-4ad6-8d97-59bbbbf56e90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1392.028640] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1392.028884] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1392.029046] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1392.280214] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b050ced5-d59e-4743-a79b-59b6ca328181 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1392.287956] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57bda2a3-dc1a-4481-8aba-aae628bbc4fb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1392.317371] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cd9cd9b-5728-4e10-8e0b-bf9db2df1a22 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1392.324061] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f07370c-b879-488a-951a-62ba5752d0fb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1392.336588] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1392.344792] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1392.362113] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1392.362303] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.563s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1408.929437] env[68964]: WARNING oslo_vmware.rw_handles [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1408.929437] env[68964]: ERROR oslo_vmware.rw_handles
[ 1408.930057] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1408.931776] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1408.932090] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Copying Virtual Disk [datastore1] vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/db8d2312-4624-4382-94c8-bfc374a738e4/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1408.932393] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ee23e884-5ec1-4590-addc-19d9f6783a69 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1408.940702] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Waiting for the task: (returnval){
[ 1408.940702] env[68964]: value = "task-3431707"
[ 1408.940702] env[68964]: _type = "Task"
[ 1408.940702] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1408.950034] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Task: {'id': task-3431707, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1409.450879] env[68964]: DEBUG oslo_vmware.exceptions [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1409.451184] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1409.451741] env[68964]: ERROR nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1409.451741] env[68964]: Faults: ['InvalidArgument']
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Traceback (most recent call last):
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] yield resources
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self.driver.spawn(context, instance, image_meta,
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self._fetch_image_if_missing(context, vi)
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] image_cache(vi, tmp_image_ds_loc)
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] vm_util.copy_virtual_disk(
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] session._wait_for_task(vmdk_copy_task)
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] return self.wait_for_task(task_ref)
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] return evt.wait()
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] result = hub.switch()
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] return self.greenlet.switch()
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self.f(*self.args, **self.kw)
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] raise exceptions.translate_fault(task_info.error)
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Faults: ['InvalidArgument']
[ 1409.451741] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147]
[ 1409.452648] env[68964]: INFO nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Terminating instance
[ 1409.453616] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1409.453835] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1409.454083] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4245b86f-56f9-4a74-9c85-4f049cd4398e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1409.456433] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1409.456618] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1409.457432] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feb829cb-c45c-42d3-b4eb-a4bb3f5522d7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1409.463905] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1409.464125] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-66b08139-c787-4d8d-b9ad-dfe267787ce7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1409.466219] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1409.466393] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1409.467331] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1950b71-a31a-4b6f-894a-674ec9fba636 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1409.472302] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Waiting for the task: (returnval){
[ 1409.472302] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ba86f8-26c6-9605-77ca-7af2b254001e"
[ 1409.472302] env[68964]: _type = "Task"
[ 1409.472302] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1409.485178] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ba86f8-26c6-9605-77ca-7af2b254001e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1409.534210] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1409.534574] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1409.534924] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Deleting the datastore file [datastore1] 4d272615-e2dd-4540-88d0-4a209f559147 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1409.535278] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8ce80174-f15e-4eca-93a4-56337c673894 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1409.541234] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Waiting for the task: (returnval){
[ 1409.541234] env[68964]: value = "task-3431709"
[ 1409.541234] env[68964]: _type = "Task"
[ 1409.541234] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1409.549462] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Task: {'id': task-3431709, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1409.982352] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1409.982656] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Creating directory with path [datastore1] vmware_temp/82d052a8-3ccf-4413-abe6-d2cc371f6734/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1409.982889] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-584ac972-f1dd-4d51-b2fb-6b457cdb4833 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.017760] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Created directory with path [datastore1] vmware_temp/82d052a8-3ccf-4413-abe6-d2cc371f6734/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1410.017983] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Fetch image to [datastore1] vmware_temp/82d052a8-3ccf-4413-abe6-d2cc371f6734/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1410.018171] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/82d052a8-3ccf-4413-abe6-d2cc371f6734/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1410.018975] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae51609-2e70-4281-86ca-1998f1d50392 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.026141] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c2097d0-9b89-44f1-9b75-91ff0f609846 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.035399] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31305387-72c1-4b84-ad44-57ef77152e8b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.072243] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87956440-134f-4a34-a28a-77ade406454d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.079242] env[68964]: DEBUG oslo_vmware.api [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Task: {'id': task-3431709, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064303} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1410.080725] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1410.080919] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1410.081110] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1410.081287] env[68964]: INFO nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 1410.083070] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-90c9efcc-4ec6-4e15-9ce6-81183e197afe {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.084963] env[68964]: DEBUG nova.compute.claims [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1410.085148] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1410.085359] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1410.108794] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1410.318083] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed30520e-6ef6-4394-8fe5-05ba31a458b3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.322647] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1410.324200] env[68964]: ERROR nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6.
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last):
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = getattr(controller, method)(*args, **kwargs)
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._get(image_id)
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] resp, body = self.http_client.get(url, headers=header)
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.request(url, 'GET', **kwargs)
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._handle_response(resp)
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exc.from_response(resp, resp.content)
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8]
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred:
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8]
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last):
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] yield resources
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self.driver.spawn(context, instance, image_meta,
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1410.324200] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._fetch_image_if_missing(context, vi)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image_fetch(context, vi, tmp_image_ds_loc)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] images.fetch_image(
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] metadata = IMAGE_API.get(context, image_ref)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return session.show(context, image_id,
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] _reraise_translated_image_exception(image_id)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise new_exc.with_traceback(exc_trace)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = getattr(controller, method)(*args, **kwargs)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._get(image_id)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] resp, body = self.http_client.get(url, headers=header)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.request(url, 'GET', **kwargs)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._handle_response(resp)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exc.from_response(resp, resp.content)
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] nova.exception.ImageNotAuthorized: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6.
[ 1410.325199] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8]
[ 1410.325199] env[68964]: INFO nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Terminating instance
[ 1410.326158] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1410.326226] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1410.326863] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4fbcd2ce-b8a6-4011-be57-b1b8e3ab946c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.329070] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1410.329268] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1410.330041] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1d12f1b-dbea-430c-babc-cd9537aa4c24 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.337300] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73daae3f-4a71-460b-a0d1-ba0d68a4d927 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.342574] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1410.342751] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1410.343996] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9773fd6d-efdb-4dc3-a3f8-bc404d25e051 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.372434] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1410.373516] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1354acfe-63ba-45b1-9fba-a97ee457052e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.376021] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b2155d00-2789-4592-9a41-06d3b0b5f595 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.378500] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Waiting for the task: (returnval){
[ 1410.378500] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a110db-c3db-4181-23fa-3e2ad7f523b8"
[ 1410.378500] env[68964]: _type = "Task"
[ 1410.378500] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1410.384971] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a741818-2546-445e-bb06-7edf2dcbc6b8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.392576] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a110db-c3db-4181-23fa-3e2ad7f523b8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1410.401906] env[68964]: DEBUG nova.compute.provider_tree [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1410.410580] env[68964]: DEBUG nova.scheduler.client.report [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1410.427881] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.342s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1410.428479] env[68964]: ERROR nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1410.428479] env[68964]: Faults: ['InvalidArgument']
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Traceback (most recent call last):
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self.driver.spawn(context, instance, image_meta,
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self._fetch_image_if_missing(context, vi)
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] image_cache(vi, tmp_image_ds_loc)
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] vm_util.copy_virtual_disk(
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] session._wait_for_task(vmdk_copy_task)
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] return self.wait_for_task(task_ref)
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] return evt.wait()
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] result = hub.switch()
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] return self.greenlet.switch()
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] self.f(*self.args, **self.kw)
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] raise exceptions.translate_fault(task_info.error)
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Faults: ['InvalidArgument']
[ 1410.428479] env[68964]: ERROR nova.compute.manager [instance: 4d272615-e2dd-4540-88d0-4a209f559147]
[ 1410.429313] env[68964]: DEBUG nova.compute.utils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1410.430732] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Build of instance 4d272615-e2dd-4540-88d0-4a209f559147 was re-scheduled: A specified parameter was not correct: fileType
[ 1410.430732] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1410.431099] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1410.431283] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1410.431453] env[68964]: DEBUG nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1410.431610] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1410.440028] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1410.440230] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1410.440406] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Deleting the datastore file [datastore1] 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1410.440649] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2f5620c1-e8c0-44ac-a3ec-2748ade8d092 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.449027] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Waiting for the task: (returnval){
[ 1410.449027] env[68964]: value = "task-3431711"
[ 1410.449027] env[68964]: _type = "Task"
[ 1410.449027] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1410.455491] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Task: {'id': task-3431711, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1410.776442] env[68964]: DEBUG nova.network.neutron [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1410.787883] env[68964]: INFO nova.compute.manager [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Took 0.36 seconds to deallocate network for instance.
[ 1410.879843] env[68964]: INFO nova.scheduler.client.report [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Deleted allocations for instance 4d272615-e2dd-4540-88d0-4a209f559147
[ 1410.895852] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1410.895852] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Creating directory with path [datastore1] vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1410.895987] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-83f6364c-b015-4612-a620-3bee19cf589f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.906362] env[68964]: DEBUG oslo_concurrency.lockutils [None req-24535110-8b5e-47bb-8d63-76f5e97309f5 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "4d272615-e2dd-4540-88d0-4a209f559147" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 502.880s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1410.907511] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "4d272615-e2dd-4540-88d0-4a209f559147" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 307.287s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1410.907721] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "4d272615-e2dd-4540-88d0-4a209f559147-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1410.907919] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "4d272615-e2dd-4540-88d0-4a209f559147-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1410.908114] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "4d272615-e2dd-4540-88d0-4a209f559147-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1410.911097] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Created directory with path [datastore1] vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1410.911292] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Fetch image to [datastore1] vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1410.911479] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1410.911941] env[68964]: INFO nova.compute.manager [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Terminating instance
[ 1410.913783] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abee7c22-2010-41e8-9d5c-4bacf3d60834 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.916941] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1410.917124] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquired lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1410.917292] env[68964]: DEBUG nova.network.neutron [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1410.922854] env[68964]: DEBUG nova.compute.manager [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] [instance: c7e9acc0-1427-4382-bcf8-99fdcc08aac0] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1410.925463] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb16c30e-6d59-4fab-9614-d70f41752591 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.938656] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ad355f7-99dd-499c-abd7-5b912621e7b7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.973277] env[68964]: DEBUG nova.network.neutron [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1410.977879] env[68964]: DEBUG nova.compute.manager [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] [instance: c7e9acc0-1427-4382-bcf8-99fdcc08aac0] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1410.979197] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59d1bc99-71a6-41bb-8afe-dbfe2b9cc2ae {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1410.988605] env[68964]: DEBUG oslo_vmware.api [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Task: {'id': task-3431711, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061285} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1410.990111] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1410.990640] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1410.990640] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1410.990752] env[68964]: INFO nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Took 0.66 seconds to destroy the instance on the hypervisor.
[ 1410.992707] env[68964]: DEBUG nova.compute.claims [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1410.992994] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1410.993107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1410.995855] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd03bac5-85b2-4d22-b3b3-48d703a68ff2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1411.008148] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Lock "c7e9acc0-1427-4382-bcf8-99fdcc08aac0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.205s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1411.023294] env[68964]: DEBUG nova.compute.manager [None req-40eee934-f6a2-433a-abce-12fd56aa8be5
tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] [instance: e437a43d-00b7-4feb-ae97-215238cf845b] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.027892] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1411.060030] env[68964]: DEBUG nova.compute.manager [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] [instance: e437a43d-00b7-4feb-ae97-215238cf845b] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1411.086942] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1411.089151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Lock "e437a43d-00b7-4feb-ae97-215238cf845b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.254s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.148319] env[68964]: DEBUG nova.compute.manager [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] [instance: 43a9c974-399f-44cb-b836-4bdf17a8d768] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.152594] env[68964]: DEBUG nova.network.neutron [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1411.154374] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Completed reading data from the image iterator. 
{{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1411.154374] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1411.166126] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Releasing lock "refresh_cache-4d272615-e2dd-4540-88d0-4a209f559147" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1411.166126] env[68964]: DEBUG nova.compute.manager [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1411.166126] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1411.166707] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ec1f880e-58d1-40c3-ba1d-47ba1cc42c37 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.176138] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f4e572-ff17-489d-946d-5a0206ef13f5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.189587] env[68964]: DEBUG nova.compute.manager [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] [instance: 43a9c974-399f-44cb-b836-4bdf17a8d768] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1411.208341] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4d272615-e2dd-4540-88d0-4a209f559147 could not be found. 
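[The WARNING above shows the teardown path tolerating a VM that has already vanished from the backend: InstanceNotFound is caught, the destroy is still logged as complete ("Instance destroyed"), and cleanup continues to network deallocation. Below is a minimal stdlib-only sketch of that pattern; the session helpers and the exception class are hypothetical stand-ins, not the actual nova.virt.vmwareapi code.]

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy_instance(session, instance_uuid):
        # Mirrors the flow in the log: a missing backend VM downgrades the
        # destroy to a warning, so the caller still proceeds to deallocate
        # networks and remove placement allocations afterwards.
        try:
            vm_ref = session.find_vm_by_uuid(instance_uuid)   # hypothetical helper
            session.power_off_and_delete(vm_ref)              # hypothetical helper
        except InstanceNotFound:
            LOG.warning("Instance does not exist on backend: %s", instance_uuid)
        LOG.debug("Instance destroyed")

    class FakeSession:
        """Backend stub whose lookup always fails, as for the instance above."""
        def find_vm_by_uuid(self, uuid):
            raise InstanceNotFound(uuid)
        def power_off_and_delete(self, ref):
            pass

    destroy_instance(FakeSession(), "4d272615-e2dd-4540-88d0-4a209f559147")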
[ 1411.208552] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1411.208730] env[68964]: INFO nova.compute.manager [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1411.208973] env[68964]: DEBUG oslo.service.loopingcall [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1411.213067] env[68964]: DEBUG nova.compute.manager [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1411.213186] env[68964]: DEBUG nova.network.neutron [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1411.222765] env[68964]: DEBUG oslo_concurrency.lockutils [None req-40eee934-f6a2-433a-abce-12fd56aa8be5 tempest-ListServersNegativeTestJSON-1408168082 tempest-ListServersNegativeTestJSON-1408168082-project-member] Lock "43a9c974-399f-44cb-b836-4bdf17a8d768" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.348s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.233093] env[68964]: DEBUG nova.compute.manager [None req-8a0746bc-aa56-4128-9459-363325689938 tempest-ServerShowV254Test-294334972 tempest-ServerShowV254Test-294334972-project-member] [instance: 530b7cf0-53d3-4bfb-b545-b0bb57dc91b7] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.239275] env[68964]: DEBUG nova.network.neutron [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1411.246926] env[68964]: DEBUG nova.network.neutron [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1411.261661] env[68964]: INFO nova.compute.manager [-] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] Took 0.05 seconds to deallocate network for instance. [ 1411.269181] env[68964]: DEBUG nova.compute.manager [None req-8a0746bc-aa56-4128-9459-363325689938 tempest-ServerShowV254Test-294334972 tempest-ServerShowV254Test-294334972-project-member] [instance: 530b7cf0-53d3-4bfb-b545-b0bb57dc91b7] Instance disappeared before build. 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1411.294555] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8a0746bc-aa56-4128-9459-363325689938 tempest-ServerShowV254Test-294334972 tempest-ServerShowV254Test-294334972-project-member] Lock "530b7cf0-53d3-4bfb-b545-b0bb57dc91b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.377s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.308916] env[68964]: DEBUG nova.compute.manager [None req-d147c4cc-1408-4300-b375-73a5bd6617e1 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] [instance: db7c702e-3d49-4eb7-9d7e-c715186f1f78] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.317434] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65354ae2-9610-4e71-9b36-13c7f4caec21 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.326539] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a6122d-0848-49b3-9fc0-65c36e91698f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.357712] env[68964]: DEBUG nova.compute.manager [None req-d147c4cc-1408-4300-b375-73a5bd6617e1 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] [instance: db7c702e-3d49-4eb7-9d7e-c715186f1f78] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1411.359164] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2433a368-0914-4bd0-8175-ef97313ae0cc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.367037] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37499b20-2ba6-4224-971c-81b7d764b9b3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.375590] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7c24a011-3de8-459c-80c5-6a91cb8148c7 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "4d272615-e2dd-4540-88d0-4a209f559147" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.468s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.377255] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "4d272615-e2dd-4540-88d0-4a209f559147" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 163.756s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1411.377364] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 4d272615-e2dd-4540-88d0-4a209f559147] During sync_power_state the instance has a pending task (deleting). Skip. 
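[The lockutils lines above record two durations per lock: how long the caller waited to acquire it (163.756s for the power-state sync lock) and how long it was then held (often 0.000s). A stdlib-only sketch of that wait/held accounting follows, assuming a simple per-name in-process lock table; the real implementation is oslo_concurrency.lockutils, which also supports external file locks.]

    import contextlib
    import logging
    import threading
    import time

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    _locks = {}  # per-name lock table; illustration only

    @contextlib.contextmanager
    def timed_lock(name, caller):
        lock = _locks.setdefault(name, threading.Lock())
        start = time.monotonic()
        lock.acquire()
        acquired = time.monotonic()
        # Corresponds to the 'acquired ... waited N.NNNs' lines in the log.
        LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs',
                  name, caller, acquired - start)
        try:
            yield
        finally:
            held = time.monotonic() - acquired
            lock.release()
            # Corresponds to the '"released" ... held N.NNNs' lines.
            LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                      name, caller, held)

    with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
        pass  # critical section goes here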
[ 1411.377541] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "4d272615-e2dd-4540-88d0-4a209f559147" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.386680] env[68964]: DEBUG nova.compute.provider_tree [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1411.388363] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d147c4cc-1408-4300-b375-73a5bd6617e1 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Lock "db7c702e-3d49-4eb7-9d7e-c715186f1f78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.709s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.394757] env[68964]: DEBUG nova.scheduler.client.report [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1411.397945] env[68964]: DEBUG nova.compute.manager [None req-b226d23b-1a38-4898-8155-8dced1791fc6 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] [instance: 178b3fc0-3f93-400f-ba19-d9703c62fd22] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.408479] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.415s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.409262] env[68964]: ERROR nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6. 
[ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = getattr(controller, method)(*args, **kwargs) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._get(image_id) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] resp, body = self.http_client.get(url, headers=header) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.request(url, 'GET', **kwargs) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._handle_response(resp) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exc.from_response(resp, resp.content) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred: [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self.driver.spawn(context, instance, image_meta, [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._fetch_image_if_missing(context, vi) [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1411.409262] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image_fetch(context, vi, tmp_image_ds_loc) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] images.fetch_image( [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] metadata = IMAGE_API.get(context, image_ref) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return session.show(context, image_id, [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] _reraise_translated_image_exception(image_id) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise new_exc.with_traceback(exc_trace) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 
94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = getattr(controller, method)(*args, **kwargs) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._get(image_id) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] resp, body = self.http_client.get(url, headers=header) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.request(url, 'GET', **kwargs) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._handle_response(resp) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exc.from_response(resp, resp.content) [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] nova.exception.ImageNotAuthorized: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6. [ 1411.409987] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.409987] env[68964]: DEBUG nova.compute.utils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6. 
{{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1411.411422] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Build of instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 was re-scheduled: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1411.411922] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1411.412301] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1411.412374] env[68964]: DEBUG nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1411.412514] env[68964]: DEBUG nova.network.neutron [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1411.421435] env[68964]: DEBUG nova.compute.manager [None req-b226d23b-1a38-4898-8155-8dced1791fc6 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] [instance: 178b3fc0-3f93-400f-ba19-d9703c62fd22] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1411.443801] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b226d23b-1a38-4898-8155-8dced1791fc6 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Lock "178b3fc0-3f93-400f-ba19-d9703c62fd22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.346s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.454447] env[68964]: DEBUG nova.compute.manager [None req-9811f56a-6f83-4d8f-9e41-b6071e3cb4c7 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] [instance: 7f64ec65-fc61-4c44-a489-a36b9f8750e2] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.479692] env[68964]: DEBUG nova.compute.manager [None req-9811f56a-6f83-4d8f-9e41-b6071e3cb4c7 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] [instance: 7f64ec65-fc61-4c44-a489-a36b9f8750e2] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1411.500172] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9811f56a-6f83-4d8f-9e41-b6071e3cb4c7 tempest-ListServerFiltersTestJSON-340682376 tempest-ListServerFiltersTestJSON-340682376-project-member] Lock "7f64ec65-fc61-4c44-a489-a36b9f8750e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.024s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.509284] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.551256] env[68964]: DEBUG neutronclient.v2_0.client [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68964) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1411.553556] env[68964]: ERROR nova.compute.manager [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = getattr(controller, method)(*args, **kwargs) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._get(image_id) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] resp, body = self.http_client.get(url, headers=header) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.request(url, 'GET', **kwargs) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._handle_response(resp) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exc.from_response(resp, resp.content) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred: [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self.driver.spawn(context, instance, image_meta, [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._fetch_image_if_missing(context, vi) [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1411.553556] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image_fetch(context, vi, tmp_image_ds_loc) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] images.fetch_image( [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] metadata = IMAGE_API.get(context, image_ref) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return session.show(context, image_id, [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] _reraise_translated_image_exception(image_id) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise new_exc.with_traceback(exc_trace) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 
94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = getattr(controller, method)(*args, **kwargs) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._get(image_id) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] resp, body = self.http_client.get(url, headers=header) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.request(url, 'GET', **kwargs) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self._handle_response(resp) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exc.from_response(resp, resp.content) [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] nova.exception.ImageNotAuthorized: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6. 
[ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred: [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._build_and_run_instance(context, instance, image, [ 1411.554376] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exception.RescheduledException( [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] nova.exception.RescheduledException: Build of instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 was re-scheduled: Not authorized for image b0d1c28b-5c3d-4c47-808f-66751157cde6. [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred: [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] exception_handler_v20(status_code, error_body) [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise client_exc(message=error_message, [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Neutron server returns request_ids: ['req-a9022ff1-3c48-4885-9f63-5ac30098a8b6'] [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 
94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred: [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._deallocate_network(context, instance, requested_networks) [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self.network_api.deallocate_for_instance( [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] data = neutron.list_ports(**search_opts) [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.list('ports', self.ports_path, retrieve_all, [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] for r in self._pagination(collection, path, **params): [ 1411.555301] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] res = self.get(path, params=params) [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 
94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.retry_request("GET", action, body=body, [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.do_request(method, action, body=body, [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._handle_fault_response(status_code, replybody, resp) [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exception.Unauthorized() [ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] nova.exception.Unauthorized: Not authorized. 
[ 1411.556281] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1411.560245] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1411.560477] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1411.561887] env[68964]: INFO nova.compute.claims [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1411.612782] env[68964]: INFO nova.scheduler.client.report [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Deleted allocations for instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 [ 1411.628786] env[68964]: DEBUG oslo_concurrency.lockutils [None req-cf50d245-a396-4122-a4c5-3fb051eb1508 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 494.895s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.630183] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 297.980s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1411.630396] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1411.630597] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1411.630758] env[68964]: DEBUG oslo_concurrency.lockutils [None
req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.632635] env[68964]: INFO nova.compute.manager [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Terminating instance [ 1411.637017] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquiring lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1411.637017] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Acquired lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1411.637017] env[68964]: DEBUG nova.network.neutron [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1411.658570] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Starting instance...
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1411.711336] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1411.788123] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05497d93-0deb-4978-b00e-34e2d832668c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.800272] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-256cacbb-d18b-4489-afbd-bad7198dd143 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.836973] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e68026b-f124-4a1b-bfa6-49ddfa39f9bf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.845460] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ff37128-c110-4192-8930-459ec44b260e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1411.860726] env[68964]: DEBUG nova.compute.provider_tree [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1411.869578] env[68964]: DEBUG nova.scheduler.client.report [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1411.885910] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.325s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1411.886379] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1411.888769] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.178s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1411.890255] env[68964]: INFO nova.compute.claims [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1411.920142] env[68964]: DEBUG nova.compute.utils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1411.921318] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1411.922736] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1411.936952] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1412.014651] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1412.020654] env[68964]: DEBUG nova.policy [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a850f3e7307468d9e739fda0ce4fdb3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed3fb39ffe124bbaae0b10d818a90c2f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1412.022867] env[68964]: DEBUG nova.network.neutron [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Updating instance_info_cache with network_info: [{"id": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "address": "fa:16:3e:a4:e7:ca", "network": {"id": "95b5d2ca-650f-464b-8a37-1f13ca25def0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.31", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "5759c8ac0b114e32b09097edb04a3e9b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f9c4edd5-d88e-4996-afea-00130ace0dad", "external-id": "nsx-vlan-transportzone-261", "segmentation_id": 261, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf9c3ede0-f9", "ovs_interfaceid": "f9c3ede0-f9bf-489f-b0c4-eebe8ecbcc5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1412.035148] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Releasing lock "refresh_cache-94ca6313-24cc-40cb-ac20-9e9c8205a9d8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1412.035148] env[68964]: DEBUG nova.compute.manager [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1412.035148] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1412.035148] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-62683cee-5520-4c02-b999-bbfa562796ab {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.046013] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1412.046346] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1412.046513] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1412.046694] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1412.046838] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1412.046982] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1412.047207] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 
tempest-AttachVolumeShelveTestJSON-262628643-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1412.047364] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1412.047526] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1412.047683] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1412.047846] env[68964]: DEBUG nova.virt.hardware [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1412.050687] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2866ee2-1804-4225-9e63-f037f3206556 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.061658] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e02a6f6-2dde-4140-aea1-bdf521ac8034 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.077600] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8e60af5-e533-4eda-acb3-a564bda647c9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.092535] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8 could not be found. [ 1412.092756] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1412.092986] env[68964]: INFO nova.compute.manager [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Took 0.06 seconds to destroy the instance on the hypervisor. 
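The network deallocation that follows is driven through oslo.service's RetryDecorator ("Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return" below). Here is a rough sketch of that wiring under stated assumptions: the retry counts and sleep times are illustrative rather than Nova's actual values, and the helper name try_deallocate_network is a hypothetical stand-in. The key behavior is that only the exception types passed to the decorator are retried, so the 401 below fails the looping call on its first iteration:

from keystoneauth1 import exceptions as ks_exc
from oslo_service import loopingcall


def try_deallocate_network(context, instance, requested_networks,
                           deallocate_network):
    """Hypothetical stand-in for ComputeManager._try_deallocate_network."""

    # RetryDecorator re-runs the wrapped callable via a DynamicLoopingCall,
    # sleeping between attempts, but only for the exception types listed.
    @loopingcall.RetryDecorator(
        max_retry_count=3,               # illustrative, not Nova's values
        inc_sleep_time=2,
        max_sleep_time=12,
        exceptions=(ks_exc.ConnectFailure,))
    def _deallocate_network_with_retries():
        deallocate_network(context, instance, requested_networks)

    # Anything outside `exceptions` (e.g. the unauthorized error raised
    # below) propagates immediately, and the looping call logs
    # "Dynamic interval looping call ... failed".
    _deallocate_network_with_retries()

Consistent with that, the looping call below reports a single failure rather than repeated attempts, and the actionable hint is the "please verify Neutron admin credential located in nova.conf" message: the [neutron] service-credential options in nova.conf are what need checking, not the deallocation path itself.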
[ 1412.093247] env[68964]: DEBUG oslo.service.loopingcall [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1412.094096] env[68964]: DEBUG nova.compute.manager [-] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1412.094202] env[68964]: DEBUG nova.network.neutron [-] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1412.137798] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cf2a384-fe9e-422c-9634-07cecf9ed952 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.144645] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a12d8a9-8e8f-486f-a598-86d95fbc7da6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.178087] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-784a7550-3f17-4e8f-95ed-522fac91d8c0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.185653] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c90dd72-ef0d-4020-a46b-8b488f70e97f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.199159] env[68964]: DEBUG nova.compute.provider_tree [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1412.209229] env[68964]: DEBUG nova.scheduler.client.report [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1412.223024] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=68964) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.223166] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1412.259885] env[68964]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68964) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1412.260232] env[68964]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-bd78e006-480f-45a2-a97e-2fa208d13067'] [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1412.261102] env[68964]: ERROR
oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1412.261102] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1412.262873] env[68964]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1412.262873] env[68964]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1412.262873] 
env[68964]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1412.262873] env[68964]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1412.262873] env[68964]: ERROR oslo.service.loopingcall [ 1412.262873] env[68964]: ERROR nova.compute.manager [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1412.281784] env[68964]: DEBUG nova.compute.utils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1412.283225] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1412.283401] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1412.295834] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1412.320697] env[68964]: ERROR nova.compute.manager [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] exception_handler_v20(status_code, error_body) [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise client_exc(message=error_message, [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Neutron server returns request_ids: ['req-bd78e006-480f-45a2-a97e-2fa208d13067'] [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During handling of the above exception, another exception occurred: [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Traceback (most recent call last): [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._delete_instance(context, instance, bdms) [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._shutdown_instance(context, instance, bdms) [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._try_deallocate_network(context, instance, requested_networks) [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] with excutils.save_and_reraise_exception(): [ 1412.320697] env[68964]: ERROR 
nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self.force_reraise() [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise self.value [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] _deallocate_network_with_retries() [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return evt.wait() [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1412.320697] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = hub.switch() [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.greenlet.switch() [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = func(*self.args, **self.kw) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] result = f(*args, **kwargs) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._deallocate_network( [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self.network_api.deallocate_for_instance( [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 
94ca6313-24cc-40cb-ac20-9e9c8205a9d8] data = neutron.list_ports(**search_opts) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.list('ports', self.ports_path, retrieve_all, [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] for r in self._pagination(collection, path, **params): [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] res = self.get(path, params=params) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.retry_request("GET", action, body=body, [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] return self.do_request(method, action, body=body, [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] ret = obj(*args, **kwargs) [ 1412.321597] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1412.322352] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] self._handle_fault_response(status_code, replybody, resp) [ 1412.322352] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1412.322352] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1412.322352] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1412.322352] env[68964]: ERROR nova.compute.manager [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] [ 1412.364521] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.734s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.365639] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 164.744s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1412.365859] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] During sync_power_state the instance has a pending task (deleting). Skip. [ 1412.366243] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "94ca6313-24cc-40cb-ac20-9e9c8205a9d8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1412.407675] env[68964]: DEBUG nova.policy [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c886bb48168465fa0285f39f206d17e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76203a6bfe0545cd986059cc74b0e614', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1412.412034] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1412.447895] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1412.448179] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1412.448357] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1412.448540] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1412.448687] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1412.448848] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1412.449087] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1412.449254] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 
tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1412.449425] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1412.449634] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1412.449748] env[68964]: DEBUG nova.virt.hardware [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1412.450885] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12fd8a84-0f7c-4e79-9670-e9821f46878c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.455133] env[68964]: INFO nova.compute.manager [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] [instance: 94ca6313-24cc-40cb-ac20-9e9c8205a9d8] Successfully reverted task state from None on failure for instance. [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server [None req-5373e14f-f0f6-49f5-8274-559d78b3d779 tempest-DeleteServersAdminTestJSON-18643757 tempest-DeleteServersAdminTestJSON-18643757-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-bd78e006-480f-45a2-a97e-2fa208d13067'] [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server raise self.value [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server raise self.value [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1412.459834] env[68964]: ERROR oslo_messaging.rpc.server raise self.value [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server raise self.value [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server raise self.value [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.461033] env[68964]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1412.461033] env[68964]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1412.462110] env[68964]: ERROR oslo_messaging.rpc.server [ 1412.463660] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b007835-1294-4cd9-982f-87ced0285187 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1412.642277] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Successfully created port: 715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1413.300030] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Successfully created port: 8d9bdaab-cdcc-4383-9746-1882ef4668d7 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1413.800937] env[68964]: DEBUG nova.compute.manager [req-0249bd53-ce41-4430-97c9-964795cd8235 req-5469491b-97b5-4d1f-a0e2-b2c3659b916d service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Received event network-vif-plugged-715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1413.801184] env[68964]: DEBUG oslo_concurrency.lockutils [req-0249bd53-ce41-4430-97c9-964795cd8235 req-5469491b-97b5-4d1f-a0e2-b2c3659b916d service nova] Acquiring lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1413.801392] env[68964]: DEBUG oslo_concurrency.lockutils [req-0249bd53-ce41-4430-97c9-964795cd8235 req-5469491b-97b5-4d1f-a0e2-b2c3659b916d service nova] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1413.801584] env[68964]: DEBUG oslo_concurrency.lockutils [req-0249bd53-ce41-4430-97c9-964795cd8235 req-5469491b-97b5-4d1f-a0e2-b2c3659b916d service nova] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1413.801707] env[68964]: DEBUG nova.compute.manager [req-0249bd53-ce41-4430-97c9-964795cd8235 req-5469491b-97b5-4d1f-a0e2-b2c3659b916d service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] No waiting events found dispatching network-vif-plugged-715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1413.801871] env[68964]: WARNING nova.compute.manager [req-0249bd53-ce41-4430-97c9-964795cd8235 req-5469491b-97b5-4d1f-a0e2-b2c3659b916d service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Received unexpected event network-vif-plugged-715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 for instance with vm_state building and task_state spawning. 
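The traceback above records the root failure in this stretch of the run: while tearing down instance 94ca6313-24cc-40cb-ac20-9e9c8205a9d8, every Neutron call came back 401 Unauthorized from Keystone, and nova's client wrapper (the frames at nova/network/neutron.py:196 and :212) converts that into NeutronAdminCredentialConfigurationInvalid, since admin-scoped credentials that fail authentication point at a deployment problem rather than a user error. A minimal sketch of that translation, assuming the simplified wrapper below (the real wrapper also retries and distinguishes user tokens from admin ones):

    from neutronclient.common import exceptions as neutron_client_exc
    from nova import exception

    def translate_unauthorized(call, admin=True):
        """Sketch of the nova/network/neutron.py wrapper seen in the
        traceback: a Keystone 401 raised by python-neutronclient is
        re-raised as a Nova configuration error for admin-scoped calls."""
        def wrapped(*args, **kwargs):
            try:
                return call(*args, **kwargs)
            except neutron_client_exc.Unauthorized:
                if admin:
                    # Misconfigured or expired [neutron] service
                    # credentials in nova.conf -- what this log shows.
                    raise exception.NeutronAdminCredentialConfigurationInvalid()
                raise
        return wrapped

The usual operational fix is to verify the keystoneauth credentials in the [neutron] section of nova.conf (auth_url, username, password, project and domain settings) against Keystone before retrying the delete.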
[ 1413.893809] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Successfully updated port: 715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1413.909688] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "refresh_cache-b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1413.909837] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired lock "refresh_cache-b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1413.909990] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1413.970540] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1414.347497] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Updating instance_info_cache with network_info: [{"id": "715f48fd-e5e2-4f8a-ae89-4ad70343e5b5", "address": "fa:16:3e:7b:fc:d8", "network": {"id": "8269932d-431e-40c7-a163-4ac9eb02d711", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2129165268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed3fb39ffe124bbaae0b10d818a90c2f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap715f48fd-e5", "ovs_interfaceid": "715f48fd-e5e2-4f8a-ae89-4ad70343e5b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1414.358643] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Releasing lock "refresh_cache-b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1414.360256] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance network_info: |[{"id": "715f48fd-e5e2-4f8a-ae89-4ad70343e5b5", "address": "fa:16:3e:7b:fc:d8", "network": {"id": "8269932d-431e-40c7-a163-4ac9eb02d711", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2129165268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed3fb39ffe124bbaae0b10d818a90c2f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap715f48fd-e5", "ovs_interfaceid": "715f48fd-e5e2-4f8a-ae89-4ad70343e5b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1414.360256] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:fc:d8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '99be9a5e-b3f9-4e6c-83d5-df11f817847d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '715f48fd-e5e2-4f8a-ae89-4ad70343e5b5', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1414.367026] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating folder: Project (ed3fb39ffe124bbaae0b10d818a90c2f). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1414.367936] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5fabb94-72b1-43da-adf8-8d2ca203a4e9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.379418] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Created folder: Project (ed3fb39ffe124bbaae0b10d818a90c2f) in parent group-v684465. [ 1414.379418] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating folder: Instances. Parent ref: group-v684578. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1414.379418] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4200e735-20e4-4ae1-b69a-2b6359ee7150 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.386355] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Created folder: Instances in parent group-v684578. [ 1414.386588] env[68964]: DEBUG oslo.service.loopingcall [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1414.386786] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1414.386968] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e1e309f1-cfa5-432d-bce0-01c0ad9ad356 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.406084] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1414.406084] env[68964]: value = "task-3431714" [ 1414.406084] env[68964]: _type = "Task" [ 1414.406084] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.413707] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431714, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1414.414490] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Successfully updated port: 8d9bdaab-cdcc-4383-9746-1882ef4668d7 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1414.422409] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "refresh_cache-3d41d454-f370-46a6-ba97-17f5553d557c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1414.422523] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquired lock "refresh_cache-3d41d454-f370-46a6-ba97-17f5553d557c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1414.422690] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1414.467501] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1414.667015] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Updating instance_info_cache with network_info: [{"id": "8d9bdaab-cdcc-4383-9746-1882ef4668d7", "address": "fa:16:3e:b8:35:b2", "network": {"id": "fa346b86-39d2-442a-98e4-0225e405b7d8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1674116479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "76203a6bfe0545cd986059cc74b0e614", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d9bdaab-cd", "ovs_interfaceid": "8d9bdaab-cdcc-4383-9746-1882ef4668d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1414.681947] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Releasing lock "refresh_cache-3d41d454-f370-46a6-ba97-17f5553d557c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1414.681947] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance network_info: |[{"id": "8d9bdaab-cdcc-4383-9746-1882ef4668d7", "address": "fa:16:3e:b8:35:b2", "network": {"id": "fa346b86-39d2-442a-98e4-0225e405b7d8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1674116479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "76203a6bfe0545cd986059cc74b0e614", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d9bdaab-cd", "ovs_interfaceid": "8d9bdaab-cdcc-4383-9746-1882ef4668d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1414.682279] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b8:35:b2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46e1fc20-2067-4e1a-9812-702772a2c82c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8d9bdaab-cdcc-4383-9746-1882ef4668d7', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1414.689935] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Creating folder: Project (76203a6bfe0545cd986059cc74b0e614). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1414.690508] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bb55ef49-b742-4bf6-95d4-bcddf11dc3cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.701910] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Created folder: Project (76203a6bfe0545cd986059cc74b0e614) in parent group-v684465. [ 1414.702117] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Creating folder: Instances. Parent ref: group-v684581. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1414.702407] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7c042190-9eb6-4ff2-881f-f7f3cf497ac7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.711743] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Created folder: Instances in parent group-v684581. [ 1414.711968] env[68964]: DEBUG oslo.service.loopingcall [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1414.712162] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1414.712354] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aa96d673-ee0a-4dbb-893a-1dc06fabe5e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1414.732783] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1414.732783] env[68964]: value = "task-3431717" [ 1414.732783] env[68964]: _type = "Task" [ 1414.732783] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.739871] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431717, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1414.916043] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431714, 'name': CreateVM_Task} progress is 25%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1415.243937] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431717, 'name': CreateVM_Task, 'duration_secs': 0.321579} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1415.244102] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1415.244798] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1415.244951] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1415.245285] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1415.245540] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a4d391c1-63bf-492a-9deb-53e5165a2380 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.249954] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 
tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Waiting for the task: (returnval){ [ 1415.249954] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5295f958-153a-ce22-ad26-91eddf692d71" [ 1415.249954] env[68964]: _type = "Task" [ 1415.249954] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1415.258963] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5295f958-153a-ce22-ad26-91eddf692d71, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1415.416217] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431714, 'name': CreateVM_Task, 'duration_secs': 0.69699} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1415.416559] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1415.417086] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1415.761520] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1415.761655] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1415.761800] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1415.761990] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1415.762303] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1415.762555] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a2e71a2-103e-486f-a005-88e5068e68cb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.767266] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){ [ 1415.767266] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525c9d9e-4295-5c63-929d-ac8f61698265" [ 1415.767266] env[68964]: _type = "Task" [ 1415.767266] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1415.774958] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525c9d9e-4295-5c63-929d-ac8f61698265, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1415.833042] env[68964]: DEBUG nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Received event network-changed-715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1415.833214] env[68964]: DEBUG nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Refreshing instance network info cache due to event network-changed-715f48fd-e5e2-4f8a-ae89-4ad70343e5b5. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1415.833426] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Acquiring lock "refresh_cache-b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1415.833566] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Acquired lock "refresh_cache-b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1415.833725] env[68964]: DEBUG nova.network.neutron [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Refreshing network info cache for port 715f48fd-e5e2-4f8a-ae89-4ad70343e5b5 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1416.065251] env[68964]: DEBUG nova.network.neutron [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Updated VIF entry in instance network info cache for port 715f48fd-e5e2-4f8a-ae89-4ad70343e5b5. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1416.065592] env[68964]: DEBUG nova.network.neutron [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Updating instance_info_cache with network_info: [{"id": "715f48fd-e5e2-4f8a-ae89-4ad70343e5b5", "address": "fa:16:3e:7b:fc:d8", "network": {"id": "8269932d-431e-40c7-a163-4ac9eb02d711", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2129165268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed3fb39ffe124bbaae0b10d818a90c2f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap715f48fd-e5", "ovs_interfaceid": "715f48fd-e5e2-4f8a-ae89-4ad70343e5b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1416.074560] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Releasing lock "refresh_cache-b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1416.074806] env[68964]: DEBUG nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Received 
event network-vif-plugged-8d9bdaab-cdcc-4383-9746-1882ef4668d7 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1416.074999] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Acquiring lock "3d41d454-f370-46a6-ba97-17f5553d557c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1416.075213] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Lock "3d41d454-f370-46a6-ba97-17f5553d557c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.075372] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Lock "3d41d454-f370-46a6-ba97-17f5553d557c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1416.075531] env[68964]: DEBUG nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] No waiting events found dispatching network-vif-plugged-8d9bdaab-cdcc-4383-9746-1882ef4668d7 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1416.075690] env[68964]: WARNING nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Received unexpected event network-vif-plugged-8d9bdaab-cdcc-4383-9746-1882ef4668d7 for instance with vm_state building and task_state spawning. [ 1416.075869] env[68964]: DEBUG nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Received event network-changed-8d9bdaab-cdcc-4383-9746-1882ef4668d7 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1416.076045] env[68964]: DEBUG nova.compute.manager [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Refreshing instance network info cache due to event network-changed-8d9bdaab-cdcc-4383-9746-1882ef4668d7. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1416.076231] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Acquiring lock "refresh_cache-3d41d454-f370-46a6-ba97-17f5553d557c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1416.076365] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Acquired lock "refresh_cache-3d41d454-f370-46a6-ba97-17f5553d557c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1416.076512] env[68964]: DEBUG nova.network.neutron [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Refreshing network info cache for port 8d9bdaab-cdcc-4383-9746-1882ef4668d7 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1416.280642] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1416.280642] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1416.280642] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1416.315294] env[68964]: DEBUG nova.network.neutron [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Updated VIF entry in instance network info cache for port 8d9bdaab-cdcc-4383-9746-1882ef4668d7. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1416.315613] env[68964]: DEBUG nova.network.neutron [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Updating instance_info_cache with network_info: [{"id": "8d9bdaab-cdcc-4383-9746-1882ef4668d7", "address": "fa:16:3e:b8:35:b2", "network": {"id": "fa346b86-39d2-442a-98e4-0225e405b7d8", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1674116479-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "76203a6bfe0545cd986059cc74b0e614", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d9bdaab-cd", "ovs_interfaceid": "8d9bdaab-cdcc-4383-9746-1882ef4668d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1416.326035] env[68964]: DEBUG oslo_concurrency.lockutils [req-d5a26bb0-42df-4703-a68e-e5fbefe1dc8b req-a5c95ce0-47f3-48fe-a620-d0df9d858817 service nova] Releasing lock "refresh_cache-3d41d454-f370-46a6-ba97-17f5553d557c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1416.836315] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1448.362596] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1448.724706] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1449.724606] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1450.720049] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1450.723640] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1450.723836] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1450.723949] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1450.747874] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748165] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748200] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748324] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748449] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748569] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748688] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748803] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.748918] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.749043] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1450.749163] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1450.749661] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1451.724229] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1451.745382] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1451.745564] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1452.725137] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1452.737165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1452.737381] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.737547] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1452.737703] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1452.738812] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-162c97b4-b989-4387-b73f-e9c009056b92 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.747927] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93d4330-b540-44f1-9b32-820a93cf3925 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.761699] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e2cb364-a56d-4ac3-aa42-65ababd2fd59 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.767956] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d0ab402-0012-4644-814a-fba29cb4ea26 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1452.796798] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1452.796944] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1452.797151] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.865984] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 244140d1-bf22-415a-b770-05f2fe106149 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866167] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866296] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866444] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866577] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866696] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866807] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.866918] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.867042] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.867159] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1452.877975] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1452.890345] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance fbf98cba-22ba-4ad6-8d97-59bbbbf56e90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1452.900036] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1452.900259] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1452.900405] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1453.045559] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbc3ee2b-f149-4229-a5ea-ab9e31536d17 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1453.053326] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-975236e3-5508-42f2-8c55-2642393ef77e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1453.082480] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0006af3-f051-415b-a6a2-f529a2e19be2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1453.089260] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1fd20ad-6ee3-47d1-a501-77924d64786b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1453.101888] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1453.109985] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1453.123103] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1453.123280] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1454.123324] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1457.667028] env[68964]: WARNING oslo_vmware.rw_handles [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1457.667028] env[68964]: ERROR oslo_vmware.rw_handles [ 1457.667835] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1457.669771] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1457.670027] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Copying Virtual Disk [datastore1] vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/7b0f62ce-074e-4308-a957-211cc0e4d882/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1457.670307] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b495a259-1fd2-4960-b69b-4451e00b1000 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1457.679118] env[68964]: DEBUG oslo_vmware.api [None 
req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Waiting for the task: (returnval){ [ 1457.679118] env[68964]: value = "task-3431718" [ 1457.679118] env[68964]: _type = "Task" [ 1457.679118] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1457.687384] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Task: {'id': task-3431718, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1458.189646] env[68964]: DEBUG oslo_vmware.exceptions [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1458.189928] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1458.190497] env[68964]: ERROR nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1458.190497] env[68964]: Faults: ['InvalidArgument'] [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] Traceback (most recent call last): [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] yield resources [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self.driver.spawn(context, instance, image_meta, [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self._fetch_image_if_missing(context, vi) [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 
244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] image_cache(vi, tmp_image_ds_loc) [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] vm_util.copy_virtual_disk( [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] session._wait_for_task(vmdk_copy_task) [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] return self.wait_for_task(task_ref) [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] return evt.wait() [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] result = hub.switch() [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] return self.greenlet.switch() [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self.f(*self.args, **self.kw) [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] raise exceptions.translate_fault(task_info.error) [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] Faults: ['InvalidArgument'] [ 1458.190497] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] [ 1458.191523] env[68964]: INFO nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 
tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Terminating instance [ 1458.192330] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1458.192537] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1458.192773] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5808e256-c788-4efe-9760-7551f14041a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.194901] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1458.195106] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1458.195824] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20aabb22-ea81-47c0-9a4b-b8db1f79e068 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.203461] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1458.203669] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c2d0eac9-6f0f-44fb-b60f-3f8b2b69ced2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.205800] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1458.205966] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1458.206895] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82b08a21-de2d-442a-97e8-a68f5b53ed8d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.211568] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Waiting for the task: (returnval){ [ 1458.211568] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52880172-ecbe-e264-1628-12fbfae1de37" [ 1458.211568] env[68964]: _type = "Task" [ 1458.211568] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1458.223587] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52880172-ecbe-e264-1628-12fbfae1de37, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1458.266430] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1458.266780] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1458.266997] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Deleting the datastore file [datastore1] 244140d1-bf22-415a-b770-05f2fe106149 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1458.267301] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cbd9796e-4c60-413e-ba63-7ba7a386839f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.273809] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Waiting for the task: (returnval){ [ 1458.273809] env[68964]: value = "task-3431720" [ 1458.273809] env[68964]: _type = "Task" [ 1458.273809] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1458.281297] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Task: {'id': task-3431720, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1458.722254] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1458.722551] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Creating directory with path [datastore1] vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1458.722726] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-40d79362-0df6-481f-aa8a-adc2ca9d185a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.733744] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Created directory with path [datastore1] vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1458.733923] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Fetch image to [datastore1] vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1458.734103] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1458.734798] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46208525-bdcd-4961-b4db-d2b4cc684805 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.740887] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96a9f076-fd53-4ac6-b7ed-77b3b819034b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.749432] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a0e2431-a6f1-47d9-9754-3d38d45e4032 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.782950] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3f7a9089-94f8-4439-a7ca-98d8a55216a7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.788836] env[68964]: DEBUG oslo_vmware.api [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Task: {'id': task-3431720, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067563} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1458.790236] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1458.790429] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1458.790596] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1458.790766] env[68964]: INFO nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Took 0.60 seconds to destroy the instance on the hypervisor. 
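Annotation: the `wait_for_task` / `_poll_task` entries above show the driver submitting `CopyVirtualDisk_Task` and polling its TaskInfo until vCenter reports the `InvalidArgument: fileType` fault, which oslo.vmware then raises as `VimFaultException` ("Fault InvalidArgument not matched" means no more specific exception class matched the fault name). A minimal sketch of that poll-and-translate pattern; `get_task_info` and its returned attributes are hypothetical stand-ins, not oslo.vmware's actual implementation:

```python
import time

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, interval=0.5):
    """Poll a vSphere task until it succeeds or raises its server-side fault.

    get_task_info is a hypothetical callable returning an object with
    state ('running' | 'success' | 'error'), progress, and error
    attributes, mirroring the TaskInfo the _poll_task entries above read.
    """
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # Fault names ('InvalidArgument' here) are mapped to exception
            # classes; unmatched names fall back to a generic fault
            # exception, which is what the log shows being raised.
            raise VimFaultException(info.error.fault_list, info.error.message)
        # Still queued/running: report progress and poll again,
        # as in the "progress is 0%" entries above.
        print(f"progress is {info.progress}%")
        time.sleep(interval)
```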
[ 1458.792466] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-933c5ece-bac9-43b0-aad8-0d627a40acb8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1458.794333] env[68964]: DEBUG nova.compute.claims [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1458.794503] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1458.796029] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1458.818697] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1458.873666] env[68964]: DEBUG oslo_vmware.rw_handles [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1458.933651] env[68964]: DEBUG oslo_vmware.rw_handles [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1458.933858] env[68964]: DEBUG oslo_vmware.rw_handles [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1459.049086] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54a944b2-ab55-46b7-984b-9259537920e0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.060791] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc24b53f-f6b1-4265-856e-711750287b94 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.096574] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd04c345-ba4b-45b5-a907-b93863244be7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.103666] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-393fc64e-bd04-4306-bf15-a89999d205e4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.116846] env[68964]: DEBUG nova.compute.provider_tree [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1459.126081] env[68964]: DEBUG nova.scheduler.client.report [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1459.147832] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.147832] env[68964]: ERROR nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1459.147832] env[68964]: Faults: ['InvalidArgument'] [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] Traceback (most recent call last): [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1459.147832] 
env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self.driver.spawn(context, instance, image_meta, [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self._fetch_image_if_missing(context, vi) [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] image_cache(vi, tmp_image_ds_loc) [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] vm_util.copy_virtual_disk( [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] session._wait_for_task(vmdk_copy_task) [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] return self.wait_for_task(task_ref) [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] return evt.wait() [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] result = hub.switch() [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] return self.greenlet.switch() [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] self.f(*self.args, **self.kw) [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] raise exceptions.translate_fault(task_info.error) [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] Faults: ['InvalidArgument'] [ 1459.147832] env[68964]: ERROR nova.compute.manager [instance: 244140d1-bf22-415a-b770-05f2fe106149] [ 1459.148693] env[68964]: DEBUG nova.compute.utils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1459.149397] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Build of instance 244140d1-bf22-415a-b770-05f2fe106149 was re-scheduled: A specified parameter was not correct: fileType [ 1459.149397] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1459.149620] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1459.149802] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1459.149974] env[68964]: DEBUG nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1459.150152] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1459.506768] env[68964]: DEBUG nova.network.neutron [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1459.519358] env[68964]: INFO nova.compute.manager [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Took 0.37 seconds to deallocate network for instance. [ 1459.622961] env[68964]: INFO nova.scheduler.client.report [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Deleted allocations for instance 244140d1-bf22-415a-b770-05f2fe106149 [ 1459.650807] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d7828755-da5f-485c-8d08-9308a81959ef tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "244140d1-bf22-415a-b770-05f2fe106149" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 487.035s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.652349] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "244140d1-bf22-415a-b770-05f2fe106149" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 291.017s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.652574] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Acquiring lock "244140d1-bf22-415a-b770-05f2fe106149-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1459.652794] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "244140d1-bf22-415a-b770-05f2fe106149-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.652940] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "244140d1-bf22-415a-b770-05f2fe106149-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.655415] env[68964]: INFO nova.compute.manager [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Terminating instance [ 1459.657934] env[68964]: DEBUG nova.compute.manager [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1459.658236] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1459.658872] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c3a9f648-22be-466c-9b23-c11f939eb20f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.664849] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1459.671634] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf890d6a-7b0e-4310-b837-254dc150ebda {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.706679] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 244140d1-bf22-415a-b770-05f2fe106149 could not be found. 
[ 1459.706854] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1459.706966] env[68964]: INFO nova.compute.manager [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1459.707242] env[68964]: DEBUG oslo.service.loopingcall [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1459.709956] env[68964]: DEBUG nova.compute.manager [-] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1459.710076] env[68964]: DEBUG nova.network.neutron [-] [instance: 244140d1-bf22-415a-b770-05f2fe106149] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1459.725427] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1459.725737] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.727517] env[68964]: INFO nova.compute.claims [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1459.740031] env[68964]: DEBUG nova.network.neutron [-] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1459.761785] env[68964]: INFO nova.compute.manager [-] [instance: 244140d1-bf22-415a-b770-05f2fe106149] Took 0.05 seconds to deallocate network for instance. 
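Annotation: `instance_claim` above takes the `compute_resources` lock, records the new instance's resources, and, on the earlier spawn failure, `abort_instance_claim` rolled the allocation back. A rough context-manager sketch of that claim/abort shape; the `Tracker` interface here is invented for illustration and is not Nova's actual resource tracker:

```python
class Tracker:
    """Toy allocation store standing in for the compute resource tracker."""
    def __init__(self):
        self.allocations = {}
    def allocate(self, uuid, resources):
        self.allocations[uuid] = resources
    def deallocate(self, uuid):
        self.allocations.pop(uuid, None)

class Claim:
    """Sketch of a resource claim that self-aborts if the build fails."""
    def __init__(self, tracker, instance_uuid, resources):
        self.tracker = tracker
        self.instance_uuid = instance_uuid
        # e.g. {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1} as logged above
        self.resources = resources

    def __enter__(self):
        self.tracker.allocate(self.instance_uuid, self.resources)
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is not None:
            # Spawn failed (e.g. the VimFaultException earlier in this
            # log): release the claim so placement stays consistent.
            self.tracker.deallocate(self.instance_uuid)
        return False  # never swallow the build exception
```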
[ 1459.868884] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9cbd3251-63ce-4cad-b2f1-9ea979390622 tempest-InstanceActionsTestJSON-2114764681 tempest-InstanceActionsTestJSON-2114764681-project-member] Lock "244140d1-bf22-415a-b770-05f2fe106149" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.216s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.870393] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "244140d1-bf22-415a-b770-05f2fe106149" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 212.248s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1459.870393] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 244140d1-bf22-415a-b770-05f2fe106149] During sync_power_state the instance has a pending task (deleting). Skip. [ 1459.870605] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "244140d1-bf22-415a-b770-05f2fe106149" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1459.969397] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f959ba4-23a9-41fb-8b48-7f7beab715c3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1459.978232] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5b86a7e-ad61-4c3c-83c5-7c3a273aa290 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.028108] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52b668cb-a409-4c1b-89f6-dea50e5cfb7c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.039418] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9f5e38e-6d8c-4516-9852-1b3bbdb4db9b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.061859] env[68964]: DEBUG nova.compute.provider_tree [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1460.072021] env[68964]: DEBUG nova.scheduler.client.report [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0,
'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1460.088200] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1460.088677] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1460.125403] env[68964]: DEBUG nova.compute.utils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1460.127067] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1460.127067] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1460.137423] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1460.206693] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1460.226230] env[68964]: DEBUG nova.policy [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21b5b62c1d9a4afc8e26b122ce6de51c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '07b4913b8fef4ee3a0d920bc36fefd18', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1460.237505] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1460.237915] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1460.237986] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1460.238282] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1460.238535] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1460.238748] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1460.239069] env[68964]: DEBUG nova.virt.hardware [None 
req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1460.239305] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1460.239566] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1460.239759] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1460.239971] env[68964]: DEBUG nova.virt.hardware [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1460.240994] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c08a17d-a08d-4a6a-b306-61d416c3731d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.251406] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2291370-25d6-47f3-951d-a73c953f35bf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1460.559615] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Successfully created port: 9bb51ffa-2df7-42c7-83f8-1be10c949380 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1461.288362] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Successfully updated port: 9bb51ffa-2df7-42c7-83f8-1be10c949380 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1461.301070] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "refresh_cache-dc39aed1-9371-469b-b43e-40ce313c8ab3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1461.302072] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 
tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "refresh_cache-dc39aed1-9371-469b-b43e-40ce313c8ab3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1461.302072] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1461.358540] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1461.551949] env[68964]: DEBUG nova.compute.manager [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Received event network-vif-plugged-9bb51ffa-2df7-42c7-83f8-1be10c949380 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1461.552187] env[68964]: DEBUG oslo_concurrency.lockutils [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] Acquiring lock "dc39aed1-9371-469b-b43e-40ce313c8ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1461.552388] env[68964]: DEBUG oslo_concurrency.lockutils [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1461.552553] env[68964]: DEBUG oslo_concurrency.lockutils [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1461.552713] env[68964]: DEBUG nova.compute.manager [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] No waiting events found dispatching network-vif-plugged-9bb51ffa-2df7-42c7-83f8-1be10c949380 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1461.552874] env[68964]: WARNING nova.compute.manager [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Received unexpected event network-vif-plugged-9bb51ffa-2df7-42c7-83f8-1be10c949380 for instance with vm_state building and task_state spawning.
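The network-vif-plugged sequence above is Nova's external-event handshake: code that expects an event registers a waiter keyed by instance and event name, the Neutron notification pops that waiter under the instance's "-events" lock, and an event with no registered waiter is reported as unexpected, exactly as the WARNING shows (here the instance is still building, so nothing was waiting yet). A simplified, self-contained sketch of that dispatch using plain threading, not Nova's actual InstanceEvents class:

import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # {instance_uuid: {event_key: threading.Event}}

    def prepare_event(self, uuid, event_key):
        # Registered in advance by the code that will wait for the event.
        with self._lock:
            waiter = threading.Event()
            self._waiters.setdefault(uuid, {})[event_key] = waiter
        return waiter

    def pop_event(self, uuid, event_key):
        # Called when the external notification arrives.
        with self._lock:
            waiter = self._waiters.get(uuid, {}).pop(event_key, None)
        if waiter is None:
            # No one was waiting: the 'unexpected event' case in the log.
            return False
        waiter.set()  # wakes the thread blocked on waiter.wait()
        return True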
[ 1461.553039] env[68964]: DEBUG nova.compute.manager [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Received event network-changed-9bb51ffa-2df7-42c7-83f8-1be10c949380 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1461.553193] env[68964]: DEBUG nova.compute.manager [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Refreshing instance network info cache due to event network-changed-9bb51ffa-2df7-42c7-83f8-1be10c949380. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1461.553486] env[68964]: DEBUG oslo_concurrency.lockutils [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] Acquiring lock "refresh_cache-dc39aed1-9371-469b-b43e-40ce313c8ab3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1461.577601] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Updating instance_info_cache with network_info: [{"id": "9bb51ffa-2df7-42c7-83f8-1be10c949380", "address": "fa:16:3e:c9:a9:30", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9bb51ffa-2d", "ovs_interfaceid": "9bb51ffa-2df7-42c7-83f8-1be10c949380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1461.589764] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "refresh_cache-dc39aed1-9371-469b-b43e-40ce313c8ab3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1461.590051] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance network_info: |[{"id": "9bb51ffa-2df7-42c7-83f8-1be10c949380", "address": "fa:16:3e:c9:a9:30", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", 
"dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9bb51ffa-2d", "ovs_interfaceid": "9bb51ffa-2df7-42c7-83f8-1be10c949380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1461.590336] env[68964]: DEBUG oslo_concurrency.lockutils [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] Acquired lock "refresh_cache-dc39aed1-9371-469b-b43e-40ce313c8ab3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1461.590509] env[68964]: DEBUG nova.network.neutron [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Refreshing network info cache for port 9bb51ffa-2df7-42c7-83f8-1be10c949380 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1461.591506] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c9:a9:30', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92fe29b3-0907-453d-aabb-5559c4bd7c0f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9bb51ffa-2df7-42c7-83f8-1be10c949380', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1461.599176] env[68964]: DEBUG oslo.service.loopingcall [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1461.599929] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1461.602188] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a170b570-9d6c-4eb7-82d3-a43c34b2699f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1461.622962] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1461.622962] env[68964]: value = "task-3431721" [ 1461.622962] env[68964]: _type = "Task" [ 1461.622962] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1461.632256] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431721, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1461.940390] env[68964]: DEBUG nova.network.neutron [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Updated VIF entry in instance network info cache for port 9bb51ffa-2df7-42c7-83f8-1be10c949380. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1461.940752] env[68964]: DEBUG nova.network.neutron [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Updating instance_info_cache with network_info: [{"id": "9bb51ffa-2df7-42c7-83f8-1be10c949380", "address": "fa:16:3e:c9:a9:30", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9bb51ffa-2d", "ovs_interfaceid": "9bb51ffa-2df7-42c7-83f8-1be10c949380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1461.951367] env[68964]: DEBUG oslo_concurrency.lockutils [req-2abceca2-8345-436d-8963-e7a0e82c2aad req-31674552-d51b-4bca-be3e-e1094c423052 service nova] Releasing lock "refresh_cache-dc39aed1-9371-469b-b43e-40ce313c8ab3" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1462.133128] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431721, 'name': CreateVM_Task, 'duration_secs': 0.290401} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1462.133259] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1462.133946] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1462.134124] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1462.134450] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1462.134698] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f7559fa3-6e02-4a18-8b38-787a43bbb49f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.139082] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1462.139082] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5279da03-f0cb-0071-4acc-c42352074f43" [ 1462.139082] env[68964]: _type = "Task" [ 1462.139082] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1462.146363] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5279da03-f0cb-0071-4acc-c42352074f43, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1462.648927] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1462.649216] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1462.649320] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1468.813344] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1468.813653] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1470.364362] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "2243b807-c2a0-4917-aae8-5de31dc52e53" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1470.364782] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1478.744138] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "8f94d3c8-4674-463d-8829-68a184967183" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance"
{{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1478.744401] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8f94d3c8-4674-463d-8829-68a184967183" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1486.280991] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "3d41d454-f370-46a6-ba97-17f5553d557c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1492.139980] env[68964]: DEBUG oslo_concurrency.lockutils [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.535993] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e878b2ba-c220-4e72-bf52-cac9dce82d18 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Acquiring lock "c3065e70-0bec-4b15-ae2d-fee36304f41e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1497.536289] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e878b2ba-c220-4e72-bf52-cac9dce82d18 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "c3065e70-0bec-4b15-ae2d-fee36304f41e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1507.132773] env[68964]: WARNING oslo_vmware.rw_handles [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1507.132773] env[68964]: ERROR
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1507.132773] env[68964]: ERROR oslo_vmware.rw_handles [ 1507.133381] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1507.135215] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1507.135448] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Copying Virtual Disk [datastore1] vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/789012f4-1774-4403-abf0-8a9c1bdecab2/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1507.135731] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-be26a506-91fd-4a16-85d2-d73af5416b66 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.143138] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Waiting for the task: (returnval){ [ 1507.143138] env[68964]: value = "task-3431722" [ 1507.143138] env[68964]: _type = "Task" [ 1507.143138] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1507.150829] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Task: {'id': task-3431722, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1507.653731] env[68964]: DEBUG oslo_vmware.exceptions [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1507.654058] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1507.654637] env[68964]: ERROR nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1507.654637] env[68964]: Faults: ['InvalidArgument'] [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Traceback (most recent call last): [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] yield resources [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self.driver.spawn(context, instance, image_meta, [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self._fetch_image_if_missing(context, vi) [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] image_cache(vi, tmp_image_ds_loc) [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] vm_util.copy_virtual_disk( [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] session._wait_for_task(vmdk_copy_task) [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] return self.wait_for_task(task_ref) [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] return evt.wait() [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] result = hub.switch() [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] return self.greenlet.switch() [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self.f(*self.args, **self.kw) [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] raise exceptions.translate_fault(task_info.error) [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Faults: ['InvalidArgument'] [ 1507.654637] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] [ 1507.655739] env[68964]: INFO nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Terminating instance [ 1507.656544] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1507.656698] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1507.656931] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b2335e3f-ce2f-40c2-b360-5a1d5042c8f2 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.659349] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1507.659536] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1507.660263] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d67b6624-fa3f-446a-b38c-393dad501b66 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.667086] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1507.667348] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4bdc4388-cdba-432b-93e0-32bbacae8da9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.669478] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1507.669650] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1507.670613] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-362717b2-f75c-43b1-915b-07869da71c47 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.675643] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for the task: (returnval){ [ 1507.675643] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52042f28-4b81-362b-83db-d5f695f11494" [ 1507.675643] env[68964]: _type = "Task" [ 1507.675643] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1507.684940] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52042f28-4b81-362b-83db-d5f695f11494, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1507.724547] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1507.740294] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1507.740499] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1507.740664] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Deleting the datastore file [datastore1] 96c1b70b-9a17-46b1-999d-558b85c77d22 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1507.740914] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fd047bd2-134e-4a7b-b701-e790ececa6ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.746950] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Waiting for the task: (returnval){ [ 1507.746950] env[68964]: value = "task-3431724" [ 1507.746950] env[68964]: _type = "Task" [ 1507.746950] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1507.754573] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Task: {'id': task-3431724, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1508.186770] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1508.186770] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Creating directory with path [datastore1] vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1508.187078] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-639ac2f9-1d8a-49a3-8526-fb2c4c61e181 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.197803] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Created directory with path [datastore1] vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1508.197991] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Fetch image to [datastore1] vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1508.198177] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1508.198916] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-019b2cfd-8592-4963-9162-15e6142f0f96 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.205525] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86411551-2b90-4e58-a01c-ddc4ddec25fb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.215656] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47005653-04c3-4e14-b217-124926b86c3a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.247487] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-431b7e2c-b6e2-490e-8e64-6504c7870834 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.258123] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-af5c4832-cacf-4ca6-9070-1ffd52050b61 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.259739] env[68964]: DEBUG oslo_vmware.api [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Task: {'id': task-3431724, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06977} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1508.259963] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1508.260154] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1508.260322] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1508.260490] env[68964]: INFO nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Took 0.60 seconds to destroy the instance on the hypervisor. 
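Every "Waiting for the task: ... task-34317xx" / "progress is 0%" / "completed successfully" triple in this stretch is the same oslo.vmware protocol: invoke a vSphere *_Task method through the session, then poll the returned task object until it finishes. A hedged sketch of how a file deletion like task-3431724 above is driven, assuming session is an already-established oslo_vmware.api.VMwareAPISession (argument names follow the vSphere DeleteDatastoreFile_Task API):

def delete_datastore_file(session, ds_path, datacenter_ref):
    # The FileManager managed object lives on the vim service content.
    file_manager = session.vim.service_content.fileManager
    # Starts the server-side task; the returned reference is the
    # 'task-3431724'-style value printed in the log.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager, name=ds_path,
                              datacenter=datacenter_ref)
    # Polls task state on an interval (the 'progress is N%' lines) and
    # raises a translated fault if the task errors out.
    session.wait_for_task(task)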
[ 1508.263213] env[68964]: DEBUG nova.compute.claims [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1508.263383] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1508.263595] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1508.284726] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1508.344573] env[68964]: DEBUG oslo_vmware.rw_handles [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1508.403693] env[68964]: DEBUG oslo_vmware.rw_handles [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1508.403899] env[68964]: DEBUG oslo_vmware.rw_handles [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1508.534254] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c15c5cc3-0950-48f4-8b2d-71603d07bf69 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1508.541198] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb1ffc20-4f5e-436e-9544-14060623953c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1508.571447] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1399400e-c678-494c-9c43-50007d3178f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1508.577892] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3158bec-16cf-4ef1-af16-652423ce71c5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1508.590543] env[68964]: DEBUG nova.compute.provider_tree [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1508.598986] env[68964]: DEBUG nova.scheduler.client.report [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1508.612165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.348s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1508.612673] env[68964]: ERROR nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1508.612673] env[68964]: Faults: ['InvalidArgument']
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Traceback (most recent call last):
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self.driver.spawn(context, instance, image_meta,
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self._fetch_image_if_missing(context, vi)
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] image_cache(vi, tmp_image_ds_loc)
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] vm_util.copy_virtual_disk(
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] session._wait_for_task(vmdk_copy_task)
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] return self.wait_for_task(task_ref)
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] return evt.wait()
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] result = hub.switch()
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] return self.greenlet.switch()
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] self.f(*self.args, **self.kw)
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] raise exceptions.translate_fault(task_info.error)
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Faults: ['InvalidArgument']
[ 1508.612673] env[68964]: ERROR nova.compute.manager [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22]
[ 1508.613539] env[68964]: DEBUG nova.compute.utils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1508.615222] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Build of instance 96c1b70b-9a17-46b1-999d-558b85c77d22 was re-scheduled: A specified parameter was not correct: fileType
[ 1508.615222] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1508.615586] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1508.615756] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1508.615922] env[68964]: DEBUG nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1508.616110] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1508.723899] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1508.936453] env[68964]: DEBUG nova.network.neutron [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1508.950046] env[68964]: INFO nova.compute.manager [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Took 0.33 seconds to deallocate network for instance.
[ 1509.123882] env[68964]: INFO nova.scheduler.client.report [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Deleted allocations for instance 96c1b70b-9a17-46b1-999d-558b85c77d22
[ 1509.150710] env[68964]: DEBUG oslo_concurrency.lockutils [None req-207e0dc7-a826-42ba-ac7e-6c79c8f9383b tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 438.483s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1509.152213] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 261.530s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1509.152425] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] During sync_power_state the instance has a pending task (spawning). Skip.
[ 1509.152607] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1509.153215] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 242.621s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1509.153433] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Acquiring lock "96c1b70b-9a17-46b1-999d-558b85c77d22-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1509.153634] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1509.153795] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1509.156317] env[68964]: INFO nova.compute.manager [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Terminating instance
[ 1509.158357] env[68964]: DEBUG nova.compute.manager [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1509.158571] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1509.158835] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-baab78a5-d550-48ed-bb7b-aa6d1fd4ccbf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.162270] env[68964]: DEBUG nova.compute.manager [None req-2c394477-3fc3-4e69-9c83-c3c767d95fe3 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: fbf98cba-22ba-4ad6-8d97-59bbbbf56e90] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1509.172708] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56d78882-5d39-4b4f-86b6-dc456d14f900 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.188658] env[68964]: DEBUG nova.compute.manager [None req-2c394477-3fc3-4e69-9c83-c3c767d95fe3 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: fbf98cba-22ba-4ad6-8d97-59bbbbf56e90] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1509.202960] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 96c1b70b-9a17-46b1-999d-558b85c77d22 could not be found.
[ 1509.203195] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1509.203375] env[68964]: INFO nova.compute.manager [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1509.203619] env[68964]: DEBUG oslo.service.loopingcall [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1509.205792] env[68964]: DEBUG nova.compute.manager [-] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1509.205918] env[68964]: DEBUG nova.network.neutron [-] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1509.217267] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2c394477-3fc3-4e69-9c83-c3c767d95fe3 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "fbf98cba-22ba-4ad6-8d97-59bbbbf56e90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.602s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1509.228517] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1509.239915] env[68964]: DEBUG nova.network.neutron [-] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1509.256694] env[68964]: INFO nova.compute.manager [-] [instance: 96c1b70b-9a17-46b1-999d-558b85c77d22] Took 0.05 seconds to deallocate network for instance.
[ 1509.289284] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1509.289560] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1509.291189] env[68964]: INFO nova.compute.claims [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1509.375837] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f2236ab7-d0ee-4d9f-bd0e-38cdeaf31ba3 tempest-AttachInterfacesV270Test-335496400 tempest-AttachInterfacesV270Test-335496400-project-member] Lock "96c1b70b-9a17-46b1-999d-558b85c77d22" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.221s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1509.541267] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15ab3358-a955-4426-be29-7994dd5dda85 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.548987] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e6e6594-8a3b-4238-9871-a55aa8016937 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.579162] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d616bb4a-d1e3-4108-a1ba-fdb6a899706c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.586525] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d115d9e-0856-4bcc-a068-de1dc67e0a73 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.600674] env[68964]: DEBUG nova.compute.provider_tree [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1509.609396] env[68964]: DEBUG nova.scheduler.client.report [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1509.623430] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1509.623908] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1509.662022] env[68964]: DEBUG nova.compute.utils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1509.662022] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1509.662022] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1509.673374] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1509.724522] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1509.724692] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1509.736088] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1509.762508] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1509.762508] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1509.762508] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1509.762508] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1509.762690] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1509.762786] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1509.762994] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1509.763379] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1509.763609] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1509.763819] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1509.764054] env[68964]: DEBUG nova.virt.hardware [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1509.764933] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1be425f7-4403-49c1-a279-526c1bcb31d5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1509.768768] env[68964]: DEBUG nova.policy [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f280ec50678e4f94ab51c3880dcb7ffa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd296b86656f48ae9c63c85c5945adab', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1509.775966] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68a902d6-814c-4e4f-b424-751463ce9656 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1510.476421] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Successfully created port: 2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1510.730057] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1510.730349] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}}
[ 1510.744362] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] There are 0 instances to clean {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}}
[ 1510.744561] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1510.744698] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances with incomplete migration {{(pid=68964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}}
[ 1511.774292] env[68964]: DEBUG nova.compute.manager [req-b8f61b35-d5d4-433d-9e8d-78a4b5f8ea43 req-dc747bf1-b84b-48b1-b339-194540ee19d2 service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Received event network-vif-plugged-2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1511.774579] env[68964]: DEBUG oslo_concurrency.lockutils [req-b8f61b35-d5d4-433d-9e8d-78a4b5f8ea43 req-dc747bf1-b84b-48b1-b339-194540ee19d2 service nova] Acquiring lock "094b1346-f24b-4360-b7c8-46fd2f2c668f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1511.774694] env[68964]: DEBUG oslo_concurrency.lockutils [req-b8f61b35-d5d4-433d-9e8d-78a4b5f8ea43 req-dc747bf1-b84b-48b1-b339-194540ee19d2 service nova] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1511.774859] env[68964]: DEBUG oslo_concurrency.lockutils [req-b8f61b35-d5d4-433d-9e8d-78a4b5f8ea43 req-dc747bf1-b84b-48b1-b339-194540ee19d2 service nova] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1511.775240] env[68964]: DEBUG nova.compute.manager [req-b8f61b35-d5d4-433d-9e8d-78a4b5f8ea43 req-dc747bf1-b84b-48b1-b339-194540ee19d2 service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] No waiting events found dispatching network-vif-plugged-2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1511.775504] env[68964]: WARNING nova.compute.manager [req-b8f61b35-d5d4-433d-9e8d-78a4b5f8ea43 req-dc747bf1-b84b-48b1-b339-194540ee19d2 service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Received unexpected event network-vif-plugged-2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 for instance with vm_state building and task_state spawning.
[ 1511.780663] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1511.895844] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Successfully updated port: 2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1511.911208] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1511.911360] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquired lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1511.911511] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1511.973911] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1512.131756] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1512.267964] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Updating instance_info_cache with network_info: [{"id": "2e8ba2cc-74f4-4acc-a55f-b2592cef72d4", "address": "fa:16:3e:22:57:94", "network": {"id": "841e9e72-3820-421f-abc4-4b6ceb35f3b4", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1344877995-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd296b86656f48ae9c63c85c5945adab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e8ba2cc-74", "ovs_interfaceid": "2e8ba2cc-74f4-4acc-a55f-b2592cef72d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1512.295470] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Releasing lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1512.295470] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance network_info: |[{"id": "2e8ba2cc-74f4-4acc-a55f-b2592cef72d4", "address": "fa:16:3e:22:57:94", "network": {"id": "841e9e72-3820-421f-abc4-4b6ceb35f3b4", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1344877995-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd296b86656f48ae9c63c85c5945adab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e8ba2cc-74", "ovs_interfaceid": "2e8ba2cc-74f4-4acc-a55f-b2592cef72d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1512.295470] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:22:57:94', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3d31a554-a94c-4471-892f-f65aa87b8279', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2e8ba2cc-74f4-4acc-a55f-b2592cef72d4', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1512.302461] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Creating folder: Project (bd296b86656f48ae9c63c85c5945adab). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1512.303212] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4a311ec5-4280-4071-a721-809b48be6691 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1512.316525] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Created folder: Project (bd296b86656f48ae9c63c85c5945adab) in parent group-v684465.
[ 1512.316996] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Creating folder: Instances. Parent ref: group-v684585. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1512.317387] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-095325f9-1492-4513-a79f-38043a1c4fed {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1512.327651] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Created folder: Instances in parent group-v684585.
[ 1512.328044] env[68964]: DEBUG oslo.service.loopingcall [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1512.328337] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1512.328667] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f388c417-8994-4d53-a5f8-6b010f3a7fa5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1512.348645] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1512.348645] env[68964]: value = "task-3431727"
[ 1512.348645] env[68964]: _type = "Task"
[ 1512.348645] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1512.357562] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431727, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1512.719034] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1512.723651] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1512.723801] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1512.723921] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1512.744928] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.745252] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.745407] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.745536] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.745659] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.745779] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.745899] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.746031] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.746153] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.746531] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1512.746728] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1512.861239] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431727, 'name': CreateVM_Task, 'duration_secs': 0.430054} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1512.861239] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1512.861646] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1512.861816] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1512.862164] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1512.862427] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a09bc9e2-896e-4f70-b261-7e14a0333e42 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1512.867118] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Waiting for the task: (returnval){
[ 1512.867118] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5213e904-4733-2e4b-6618-9e684fd7ca93"
[ 1512.867118] env[68964]: _type = "Task"
[ 1512.867118] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1512.875293] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5213e904-4733-2e4b-6618-9e684fd7ca93, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1513.377154] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1513.377503] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1513.377724] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1513.724725] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1513.724850] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1513.724977] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1513.736371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1513.736594] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1513.736757] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1513.736912] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1513.738013] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea281e4-5143-45e3-b44f-bf68f6f2c325 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.746736] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f10d767-9747-44a1-8216-b181b963be84 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.760695] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e784ce5-8921-4d68-b3ba-e9b9f240169f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.767205] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-224c2aec-bf3b-4c16-b0ae-9726ebe44a68 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.798144] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1513.798336] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[
1513.798537] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1513.802197] env[68964]: DEBUG nova.compute.manager [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Received event network-changed-2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1513.802387] env[68964]: DEBUG nova.compute.manager [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Refreshing instance network info cache due to event network-changed-2e8ba2cc-74f4-4acc-a55f-b2592cef72d4. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1513.802608] env[68964]: DEBUG oslo_concurrency.lockutils [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] Acquiring lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1513.802714] env[68964]: DEBUG oslo_concurrency.lockutils [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] Acquired lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1513.802868] env[68964]: DEBUG nova.network.neutron [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Refreshing network info cache for port 2e8ba2cc-74f4-4acc-a55f-b2592cef72d4 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1513.876967] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877292] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877292] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877390] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877501] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877622] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877739] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877856] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.877970] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.878095] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1513.890258] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1513.902462] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1513.913126] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1513.924959] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c3065e70-0bec-4b15-ae2d-fee36304f41e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1513.924959] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1513.924959] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1514.120637] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a2a60a7-f8c3-4d70-8f9d-aa0df63a2481 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.129459] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad02e344-03ca-4294-b896-b15efb23aa97 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.135590] env[68964]: DEBUG nova.network.neutron [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Updated VIF entry in instance network info cache for port 2e8ba2cc-74f4-4acc-a55f-b2592cef72d4. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1514.136064] env[68964]: DEBUG nova.network.neutron [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Updating instance_info_cache with network_info: [{"id": "2e8ba2cc-74f4-4acc-a55f-b2592cef72d4", "address": "fa:16:3e:22:57:94", "network": {"id": "841e9e72-3820-421f-abc4-4b6ceb35f3b4", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1344877995-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd296b86656f48ae9c63c85c5945adab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e8ba2cc-74", "ovs_interfaceid": "2e8ba2cc-74f4-4acc-a55f-b2592cef72d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1514.170043] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe516f56-f006-4be9-87a7-a78f5f0db1e4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.172976] env[68964]: DEBUG oslo_concurrency.lockutils [req-7c623ba3-c33e-4fef-927a-14760712a0bf req-7734661b-21c4-4e94-8760-c8473b56f36e service nova] Releasing lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1514.178349] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ba85572-c505-4972-899d-8c31e52616c9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.193884] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1514.202054] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1514.217026] env[68964]: DEBUG nova.compute.resource_tracker [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1514.217214] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.419s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.217056] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1533.860714] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1533.861165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.199362] env[68964]: WARNING oslo_vmware.rw_handles [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1558.199362] env[68964]: ERROR oslo_vmware.rw_handles [ 1558.200181] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member]
[instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1558.202152] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1558.202395] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Copying Virtual Disk [datastore1] vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/5348f236-f723-4c9d-933f-2e2966cdb0e6/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1558.202672] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-feb7416e-cf8a-4e49-b527-799d5805bf4d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.210717] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for the task: (returnval){ [ 1558.210717] env[68964]: value = "task-3431728" [ 1558.210717] env[68964]: _type = "Task" [ 1558.210717] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1558.218675] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Task: {'id': task-3431728, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1558.721793] env[68964]: DEBUG oslo_vmware.exceptions [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1558.722076] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1558.722641] env[68964]: ERROR nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1558.722641] env[68964]: Faults: ['InvalidArgument'] [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Traceback (most recent call last): [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] yield resources [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] self.driver.spawn(context, instance, image_meta, [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] self._fetch_image_if_missing(context, vi) [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] image_cache(vi, tmp_image_ds_loc) [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] vm_util.copy_virtual_disk( [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] session._wait_for_task(vmdk_copy_task) [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] return self.wait_for_task(task_ref) [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] return evt.wait() [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] result = hub.switch() [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] return self.greenlet.switch() [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] self.f(*self.args, **self.kw) [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] raise exceptions.translate_fault(task_info.error) [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Faults: ['InvalidArgument'] [ 1558.722641] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] [ 1558.723582] env[68964]: INFO nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Terminating instance [ 1558.724417] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1558.724625] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1558.724856] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf405ba0-f224-4b92-b12a-a03e688f2af5 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.726818] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1558.727030] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquired lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1558.727216] env[68964]: DEBUG nova.network.neutron [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1558.733933] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1558.734116] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1558.734785] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f548c5b9-af11-44ab-b532-ca974817dd85 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.742187] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 1558.742187] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52904c72-e8ab-3de9-b164-37445e6955ae" [ 1558.742187] env[68964]: _type = "Task" [ 1558.742187] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1558.749757] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52904c72-e8ab-3de9-b164-37445e6955ae, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1558.755901] env[68964]: DEBUG nova.network.neutron [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1558.855433] env[68964]: DEBUG nova.network.neutron [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1558.863658] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Releasing lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1558.864077] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1558.864283] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1558.865425] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ddb495e-fc91-4f3e-bb5e-06074621d2f9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.873316] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1558.873549] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e2082604-8efe-4d8d-ab12-cc97475e6f9e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.901210] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1558.901614] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1558.901614] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Deleting the datastore file [datastore1] 8ee9e517-075e-4faf-9740-32f8fa585eb5 {{(pid=68964) file_delete 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1558.901840] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d4c50a4e-2a92-4406-9270-52bca5af2cbb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1558.908169] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for the task: (returnval){ [ 1558.908169] env[68964]: value = "task-3431730" [ 1558.908169] env[68964]: _type = "Task" [ 1558.908169] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1558.917472] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Task: {'id': task-3431730, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1559.252862] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1559.253190] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating directory with path [datastore1] vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1559.253320] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9bceba4d-37a0-4328-b4ad-c98a2792086a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.263910] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created directory with path [datastore1] vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1559.264135] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Fetch image to [datastore1] vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1559.264337] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data 
store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1559.265041] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5310c3f-c220-4640-b540-fad13325094f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.271466] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f115f2ab-847b-4c30-a9aa-8c777daea839 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.280268] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7743a372-e85f-4b4d-8d67-ac93ff855a2a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.310348] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b149080-c5e1-4324-9a89-f97d251bbdd5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.315692] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-46377b72-8181-4952-b714-6c13e1f57ab7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.338166] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1559.386607] env[68964]: DEBUG oslo_vmware.rw_handles [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1559.446381] env[68964]: DEBUG oslo_vmware.rw_handles [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1559.446564] env[68964]: DEBUG oslo_vmware.rw_handles [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1559.450678] env[68964]: DEBUG oslo_vmware.api [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Task: {'id': task-3431730, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.031352} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1559.450967] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1559.451184] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1559.451360] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1559.451531] env[68964]: INFO nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Took 0.59 seconds to destroy the instance on the hypervisor. [ 1559.451776] env[68964]: DEBUG oslo.service.loopingcall [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1559.451995] env[68964]: DEBUG nova.compute.manager [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network deallocation for instance since networking was not requested.
{{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1559.454329] env[68964]: DEBUG nova.compute.claims [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1559.454496] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1559.454716] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.509443] env[68964]: DEBUG nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Refreshing inventories for resource provider 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1559.524056] env[68964]: DEBUG nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Updating ProviderTree inventory for provider 63b0294e-f555-48a6-a542-3466427066a9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1559.524329] env[68964]: DEBUG nova.compute.provider_tree [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1559.534665] env[68964]: DEBUG nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Refreshing aggregate associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, aggregates: None {{(pid=68964) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1559.551542] env[68964]: DEBUG nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Refreshing trait associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1559.707793] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66c5e003-b675-4094-9c29-004de6684bab {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.715396] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9663980-1f78-47d8-85b2-edb415dea393 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.745733] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0127c1d-88fd-432e-9d2e-c53df97f3ecc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.752463] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-987d14cf-2a90-4099-86e1-6e9ac516703c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1559.765403] env[68964]: DEBUG nova.compute.provider_tree [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1559.773361] env[68964]: DEBUG nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1559.788391] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.334s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.788926] env[68964]: ERROR nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType
[ 1559.788926] env[68964]: Faults: ['InvalidArgument']
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Traceback (most recent call last):
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     self.driver.spawn(context, instance, image_meta,
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     self._fetch_image_if_missing(context, vi)
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     image_cache(vi, tmp_image_ds_loc)
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     vm_util.copy_virtual_disk(
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     session._wait_for_task(vmdk_copy_task)
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     return self.wait_for_task(task_ref)
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     return evt.wait()
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     result = hub.switch()
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     return self.greenlet.switch()
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     self.f(*self.args, **self.kw)
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]     raise exceptions.translate_fault(task_info.error)
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Faults: ['InvalidArgument']
[ 1559.788926] env[68964]: ERROR nova.compute.manager [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5]
[ 1559.789843] env[68964]: DEBUG nova.compute.utils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1559.790983] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Build of instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 was re-scheduled: A specified parameter was not correct: fileType
[ 1559.790983] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1559.791365] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1559.791581] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1559.791728] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquired lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1559.791884] env[68964]: DEBUG nova.network.neutron [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1559.814976] env[68964]: DEBUG nova.network.neutron [None
req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1559.871709] env[68964]: DEBUG nova.network.neutron [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1559.880065] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Releasing lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1559.880296] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1559.880476] env[68964]: DEBUG nova.compute.manager [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Skipping network deallocation for instance since networking was not requested. {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1559.964739] env[68964]: INFO nova.scheduler.client.report [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Deleted allocations for instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 [ 1559.985893] env[68964]: DEBUG oslo_concurrency.lockutils [None req-84ce825e-4b0b-4e8c-8d9e-ab68c106ef80 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 486.431s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.987044] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 312.365s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.987240] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1559.987411] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.988047] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 290.507s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.988284] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "8ee9e517-075e-4faf-9740-32f8fa585eb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1559.988487] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1559.988647] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1559.990465] env[68964]: INFO nova.compute.manager [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Terminating instance [ 1559.992048] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquiring lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1559.992210] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Acquired lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1559.992376] env[68964]: DEBUG nova.network.neutron [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Building network info cache 
for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1559.997562] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1560.018564] env[68964]: DEBUG nova.network.neutron [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1560.050630] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1560.050876] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1560.052450] env[68964]: INFO nova.compute.claims [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1560.091305] env[68964]: DEBUG nova.network.neutron [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1560.101266] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Releasing lock "refresh_cache-8ee9e517-075e-4faf-9740-32f8fa585eb5" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1560.101319] env[68964]: DEBUG nova.compute.manager [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1560.102036] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1560.102036] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f23b3a85-e08e-4232-b30a-4d66ba0ba9be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.112266] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f28732c7-3685-4d5b-8e78-0fabcf30c1f3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.143518] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8ee9e517-075e-4faf-9740-32f8fa585eb5 could not be found. [ 1560.143518] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1560.143746] env[68964]: INFO nova.compute.manager [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1560.144040] env[68964]: DEBUG oslo.service.loopingcall [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1560.146501] env[68964]: DEBUG nova.compute.manager [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1560.146653] env[68964]: DEBUG nova.network.neutron [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1560.168635] env[68964]: DEBUG nova.network.neutron [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1560.175849] env[68964]: DEBUG nova.network.neutron [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1560.186036] env[68964]: INFO nova.compute.manager [-] [instance: 8ee9e517-075e-4faf-9740-32f8fa585eb5] Took 0.04 seconds to deallocate network for instance. 
[ 1560.278971] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0e835346-5c51-47cb-962b-45d3ac367ed8 tempest-ServersAaction247Test-1033986783 tempest-ServersAaction247Test-1033986783-project-member] Lock "8ee9e517-075e-4faf-9740-32f8fa585eb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.291s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.298057] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea11e1c0-5361-4966-9456-68e148979738 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.305663] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-355c4ef8-c8b9-4679-a4fb-745751587cdd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.335192] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-993bbe5c-ca60-44b1-b3d0-1cdeed594b68 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.341866] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1a51397-ae0f-4998-af1c-91bfc9edd2c3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.355404] env[68964]: DEBUG nova.compute.provider_tree [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1560.365035] env[68964]: DEBUG nova.scheduler.client.report [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1560.379388] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.394685] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "0495eec7-305b-44a1-90ec-8e6bca83fe82" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1560.394938] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "0495eec7-305b-44a1-90ec-8e6bca83fe82" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1560.399460] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "0495eec7-305b-44a1-90ec-8e6bca83fe82" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.004s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.399904] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1560.436145] env[68964]: DEBUG nova.compute.utils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1560.437467] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1560.437636] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1560.446295] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Start building block device mappings for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1560.508268] env[68964]: DEBUG nova.policy [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d53aa73a7f64e10a947c9816dd5799c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0e12b3903714e7e87247a8a42cf1a7a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1560.511360] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1560.535873] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1560.536161] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1560.536369] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1560.536608] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1560.536808] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1560.537032] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 
tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1560.537425] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1560.537506] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1560.537671] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1560.537834] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1560.538042] env[68964]: DEBUG nova.virt.hardware [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1560.538873] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66027967-a8f7-4cd9-98a2-2761177fe9b9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.547049] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8c6d09b-f1f0-478a-aff6-6ae1943f2ee0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.826828] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Successfully created port: 1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1561.635401] env[68964]: DEBUG nova.compute.manager [req-baec677c-0cac-4126-b05b-482f4b75cc89 req-e98c76dd-0c77-4e7e-bfe6-f65e2de39544 service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Received event network-vif-plugged-1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1561.635663] env[68964]: DEBUG oslo_concurrency.lockutils [req-baec677c-0cac-4126-b05b-482f4b75cc89 req-e98c76dd-0c77-4e7e-bfe6-f65e2de39544 service nova] Acquiring lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1561.635818] env[68964]: DEBUG oslo_concurrency.lockutils [req-baec677c-0cac-4126-b05b-482f4b75cc89 req-e98c76dd-0c77-4e7e-bfe6-f65e2de39544 service nova] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1561.636063] env[68964]: DEBUG oslo_concurrency.lockutils [req-baec677c-0cac-4126-b05b-482f4b75cc89 req-e98c76dd-0c77-4e7e-bfe6-f65e2de39544 service nova] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1561.636249] env[68964]: DEBUG nova.compute.manager [req-baec677c-0cac-4126-b05b-482f4b75cc89 req-e98c76dd-0c77-4e7e-bfe6-f65e2de39544 service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] No waiting events found dispatching network-vif-plugged-1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1561.636413] env[68964]: WARNING nova.compute.manager [req-baec677c-0cac-4126-b05b-482f4b75cc89 req-e98c76dd-0c77-4e7e-bfe6-f65e2de39544 service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Received unexpected event network-vif-plugged-1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d for instance with vm_state building and task_state spawning. [ 1561.776795] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Successfully updated port: 1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1561.791596] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "refresh_cache-8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1561.791752] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquired lock "refresh_cache-8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1561.791902] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1561.843538] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1561.999782] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Updating instance_info_cache with network_info: [{"id": "1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d", "address": "fa:16:3e:18:ec:b9", "network": {"id": "65c9f526-15b6-4513-82c3-2ef267c8d2c4", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-93366620-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f0e12b3903714e7e87247a8a42cf1a7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "424fd631-4456-4ce2-8924-a2ed81d60bd6", "external-id": "nsx-vlan-transportzone-19", "segmentation_id": 19, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d7a8a3f-54", "ovs_interfaceid": "1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1562.019424] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Releasing lock "refresh_cache-8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1562.019746] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance network_info: |[{"id": "1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d", "address": "fa:16:3e:18:ec:b9", "network": {"id": "65c9f526-15b6-4513-82c3-2ef267c8d2c4", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-93366620-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f0e12b3903714e7e87247a8a42cf1a7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "424fd631-4456-4ce2-8924-a2ed81d60bd6", "external-id": "nsx-vlan-transportzone-19", "segmentation_id": 19, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d7a8a3f-54", "ovs_interfaceid": "1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1562.020142] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:18:ec:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '424fd631-4456-4ce2-8924-a2ed81d60bd6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1562.029909] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Creating folder: Project (f0e12b3903714e7e87247a8a42cf1a7a). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1562.030501] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-144da8c2-d5e4-4b74-a2b0-d8bf2594b18f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1562.040743] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Created folder: Project (f0e12b3903714e7e87247a8a42cf1a7a) in parent group-v684465.
[ 1562.040934] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Creating folder: Instances. Parent ref: group-v684588. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1562.041183] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-551f9135-965f-4352-a874-7041073050fe {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1562.049558] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Created folder: Instances in parent group-v684588.
[ 1562.049781] env[68964]: DEBUG oslo.service.loopingcall [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1562.049964] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1562.050183] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3db8fa78-c12e-4297-a6cf-ff62d8d8bc00 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1562.069137] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1562.069137] env[68964]: value = "task-3431733"
[ 1562.069137] env[68964]: _type = "Task"
[ 1562.069137] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1562.078708] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431733, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1562.579481] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431733, 'name': CreateVM_Task, 'duration_secs': 0.301923} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1562.579664] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1562.580699] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1562.580699] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1562.580990] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1562.581464] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-892ed6d9-e141-452b-be19-eb2aeb5aa18c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1562.586677] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Waiting for the task: (returnval){
[ 1562.586677] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528727f9-2e68-db1b-5748-1de764fe05e0"
[ 1562.586677] env[68964]: _type = "Task"
[ 1562.586677] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1562.594877] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528727f9-2e68-db1b-5748-1de764fe05e0, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1563.097049] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1563.097346] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1563.097551] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1563.733496] env[68964]: DEBUG nova.compute.manager [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Received event network-changed-1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1563.733496] env[68964]: DEBUG nova.compute.manager [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Refreshing instance network info cache due to event network-changed-1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1563.733659] env[68964]: DEBUG oslo_concurrency.lockutils [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] Acquiring lock "refresh_cache-8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1563.733768] env[68964]: DEBUG oslo_concurrency.lockutils [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] Acquired lock "refresh_cache-8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1563.733931] env[68964]: DEBUG nova.network.neutron [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Refreshing network info cache for port 1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1563.978376] env[68964]: DEBUG nova.network.neutron [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Updated VIF entry in instance network info cache for port 1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1563.978740] env[68964]: DEBUG nova.network.neutron [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Updating instance_info_cache with network_info: [{"id": "1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d", "address": "fa:16:3e:18:ec:b9", "network": {"id": "65c9f526-15b6-4513-82c3-2ef267c8d2c4", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-93366620-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f0e12b3903714e7e87247a8a42cf1a7a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "424fd631-4456-4ce2-8924-a2ed81d60bd6", "external-id": "nsx-vlan-transportzone-19", "segmentation_id": 19, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d7a8a3f-54", "ovs_interfaceid": "1d7a8a3f-5488-40b5-b37a-88c2f8a6d27d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1563.992031] env[68964]: DEBUG oslo_concurrency.lockutils [req-2dd37984-373a-4455-9007-54a27d9e8453 req-4b9fa58d-5240-4f12-b083-98f6f4b8c38c service nova] Releasing lock "refresh_cache-8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1568.724512] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1570.724223] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1570.724535] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1573.719724] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1573.724386] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1574.724560] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running 
periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1574.724560] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1574.724898] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1574.745881] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746054] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746185] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746315] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746439] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746608] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746750] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746873] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.746991] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.747129] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1574.747285] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1574.747833] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1574.747974] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1575.725307] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1575.739513] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.739760] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.739906] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.740068] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1575.741285] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45e8f423-22d5-41ac-8f68-e7140f5a6252 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.750105] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c02bbe5-0493-471d-ba35-8faffed4b986 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.764974] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b8ffe866-8525-46a8-a3e2-643a5eeca7f6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.771565] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b90e96-97fb-464c-8f67-f94ce2477d82 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.801987] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1575.802162] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.802356] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.877902] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 704ec14b-410e-4175-b032-69074b332d87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878080] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878245] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878396] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878526] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878648] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878768] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878884] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.878998] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.879126] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1575.890337] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1575.900823] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1575.910250] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c3065e70-0bec-4b15-ae2d-fee36304f41e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1575.919844] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1575.920097] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1575.920273] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1576.078908] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79133332-b84c-485f-99ab-68aacad01cc3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.086425] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adffc055-4021-4d3f-b66d-f6df06e8a40a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.117032] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1cf80d9-c4fb-4154-9d7e-5ce99861d13d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.123761] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-021f3a90-5a4f-4d73-9959-fd21650d4f39 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.136503] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1576.144928] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1576.157599] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1576.157774] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1577.152075] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1577.724164] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1609.006095] env[68964]: WARNING oslo_vmware.rw_handles [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1609.006095] env[68964]: ERROR oslo_vmware.rw_handles [ 1609.006095] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1609.008109] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1609.008352] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Copying Virtual Disk [datastore1] vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/368fca49-99c9-4afb-95f1-5d62ea17b0f1/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1609.009082] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0fe50235-ddab-48ce-b7a9-4cf7d491f0c4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.017928] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 1609.017928] env[68964]: value = "task-3431734" [ 1609.017928] env[68964]: _type = "Task" [ 1609.017928] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1609.025552] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431734, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1609.528264] env[68964]: DEBUG oslo_vmware.exceptions [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1609.528505] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1609.529063] env[68964]: ERROR nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1609.529063] env[68964]: Faults: ['InvalidArgument'] [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] Traceback (most recent call last): [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] yield resources [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self.driver.spawn(context, instance, image_meta, [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self._fetch_image_if_missing(context, vi) [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] image_cache(vi, tmp_image_ds_loc) [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] vm_util.copy_virtual_disk( [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] session._wait_for_task(vmdk_copy_task) [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] return self.wait_for_task(task_ref) [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] return evt.wait() [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] result = hub.switch() [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] return self.greenlet.switch() [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self.f(*self.args, **self.kw) [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] raise exceptions.translate_fault(task_info.error) [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] Faults: ['InvalidArgument'] [ 1609.529063] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] [ 1609.529948] env[68964]: INFO nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Terminating instance [ 1609.531039] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1609.531147] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1609.531342] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-c1d9e6a6-8fbd-46c9-883c-17578ccb0a96 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.533685] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1609.533878] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1609.534609] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89b7240f-5ec1-484a-b00f-0d28c0c402da {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.541209] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1609.541415] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-65bb2604-f120-489d-bf77-7536f7efde4b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.543527] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1609.543699] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1609.544624] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-937ee388-af63-401f-a327-75b81ff6a44e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.549356] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Waiting for the task: (returnval){ [ 1609.549356] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d20b22-c4eb-ae5e-f741-32070cf98814" [ 1609.549356] env[68964]: _type = "Task" [ 1609.549356] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1609.556229] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d20b22-c4eb-ae5e-f741-32070cf98814, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1609.613573] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1609.613815] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1609.613971] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleting the datastore file [datastore1] 704ec14b-410e-4175-b032-69074b332d87 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1609.614267] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e73b5a73-56fe-4845-a8d7-fb413275a82f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1609.620326] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 1609.620326] env[68964]: value = "task-3431736" [ 1609.620326] env[68964]: _type = "Task" [ 1609.620326] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1609.627812] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431736, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1610.059748] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1610.060078] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Creating directory with path [datastore1] vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1610.060231] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-93d137af-2baa-4de5-af48-b982de7c92a8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.070737] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Created directory with path [datastore1] vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1610.070928] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Fetch image to [datastore1] vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1610.071192] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1610.071859] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a12ce4a1-1e19-41f2-8518-196549c76a63 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.079542] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94266460-07d5-4456-8c38-fa6a15ae176e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.088322] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9381f31-005d-4f80-be0f-88677651548e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.118504] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c361eb5a-cb74-4222-abf5-91f3066596ee {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.129535] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e76417b6-d1be-4eb3-98e0-675c341e103c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.130715] env[68964]: DEBUG oslo_vmware.api [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431736, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064309} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1610.130945] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1610.131140] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1610.131310] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1610.131479] env[68964]: INFO nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Took 0.60 seconds to destroy the instance on the hypervisor. 
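[annotation] The records above trace the standard oslo.vmware task pattern: an asynchronous vCenter call (CopyVirtualDisk_Task, UnregisterVM, DeleteDatastoreFile_Task) returns a task handle such as task-3431736, and wait_for_task/_poll_task log progress until the task reaches a terminal state, raising a translated VimFaultException on error (here the InvalidArgument 'fileType' fault, versus the DeleteDatastoreFile_Task that "completed successfully" in 0.064309s). A minimal sketch of that polling loop, assuming a hypothetical get_task_info() callable in place of the real oslo.vmware session internals:

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(get_task_info, interval=0.5):
        # Poll until vCenter reports a terminal state, mirroring the
        # "progress is 0%" ... "completed successfully" records above.
        while True:
            info = get_task_info()  # hypothetical TaskInfo-like object
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware maps the vSphere fault to an exception class
                # here (see "Fault InvalidArgument not matched" above).
                raise VimFaultException(info.error)
            time.sleep(interval)  # the real code drives this from a looping call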
[ 1610.133559] env[68964]: DEBUG nova.compute.claims [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1610.133729] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1610.133920] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1610.156086] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1610.205516] env[68964]: DEBUG oslo_vmware.rw_handles [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1610.266335] env[68964]: DEBUG oslo_vmware.rw_handles [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1610.266665] env[68964]: DEBUG oslo_vmware.rw_handles [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1610.402967] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03b14365-2b11-4a93-8844-e6c1816c9666 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.410459] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acc82331-3e74-4cec-a0de-806125832923 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.441630] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61719f42-e1a2-403b-b93d-d215c7680dfd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.448875] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-183d91d9-f72f-43a2-bbf5-4a3c2192ddc6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.462407] env[68964]: DEBUG nova.compute.provider_tree [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1610.471394] env[68964]: DEBUG nova.scheduler.client.report [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1610.486392] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.352s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1610.487105] env[68964]: ERROR nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1610.487105] env[68964]: Faults: ['InvalidArgument'] [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] Traceback (most recent call last): [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1610.487105] 
env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self.driver.spawn(context, instance, image_meta, [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self._fetch_image_if_missing(context, vi) [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] image_cache(vi, tmp_image_ds_loc) [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] vm_util.copy_virtual_disk( [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] session._wait_for_task(vmdk_copy_task) [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] return self.wait_for_task(task_ref) [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] return evt.wait() [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] result = hub.switch() [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] return self.greenlet.switch() [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] self.f(*self.args, **self.kw) [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] raise exceptions.translate_fault(task_info.error) [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] Faults: ['InvalidArgument'] [ 1610.487105] env[68964]: ERROR nova.compute.manager [instance: 704ec14b-410e-4175-b032-69074b332d87] [ 1610.488014] env[68964]: DEBUG nova.compute.utils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1610.489279] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Build of instance 704ec14b-410e-4175-b032-69074b332d87 was re-scheduled: A specified parameter was not correct: fileType [ 1610.489279] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1610.489849] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1610.489849] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1610.489970] env[68964]: DEBUG nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1610.490182] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1610.897693] env[68964]: DEBUG nova.network.neutron [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1610.909882] env[68964]: INFO nova.compute.manager [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Took 0.42 seconds to deallocate network for instance. [ 1611.009402] env[68964]: INFO nova.scheduler.client.report [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleted allocations for instance 704ec14b-410e-4175-b032-69074b332d87 [ 1611.032515] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6929146-c4f6-45ad-a19e-dfb914f45f6a tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "704ec14b-410e-4175-b032-69074b332d87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 524.880s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1611.033924] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "704ec14b-410e-4175-b032-69074b332d87" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 363.411s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1611.034082] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 704ec14b-410e-4175-b032-69074b332d87] During sync_power_state the instance has a pending task (spawning). Skip. 
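[annotation] The lock lines in this section come from oslo.concurrency's lockutils, which logs Acquiring/acquired/released together with waited/held timings; the values above (the build path held the instance lock for 524.880s, while _sync_power_states waited 363.411s for it) show how per-instance locks serialize the failed build, the power-state sync, and the terminate that follows. A small usage sketch, assuming oslo.concurrency is installed; the lock names are taken from the records above but the functions are illustrative:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('704ec14b-410e-4175-b032-69074b332d87')
    def do_terminate_instance():
        # Runs only once no other holder of this instance lock remains;
        # lockutils emits the "acquired ... waited" / "released ... held"
        # DEBUG lines seen throughout this log.
        pass

    # Equivalent context-manager form, as used around the resource tracker:
    with lockutils.lock('compute_resources'):
        pass  # critical section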
[ 1611.034265] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "704ec14b-410e-4175-b032-69074b332d87" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1611.034910] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "704ec14b-410e-4175-b032-69074b332d87" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 328.740s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1611.035098] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "704ec14b-410e-4175-b032-69074b332d87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1611.035448] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "704ec14b-410e-4175-b032-69074b332d87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1611.035448] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "704ec14b-410e-4175-b032-69074b332d87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1611.037605] env[68964]: INFO nova.compute.manager [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Terminating instance [ 1611.039285] env[68964]: DEBUG nova.compute.manager [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1611.039480] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1611.039735] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9c156e4c-e0bd-47f4-a157-f7e2b7650acc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.043458] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1611.050168] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be6fba5-a874-497b-9a52-ce16e5f2697a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.079503] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 704ec14b-410e-4175-b032-69074b332d87 could not be found. [ 1611.079779] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1611.079823] env[68964]: INFO nova.compute.manager [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 704ec14b-410e-4175-b032-69074b332d87] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1611.080069] env[68964]: DEBUG oslo.service.loopingcall [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1611.082165] env[68964]: DEBUG nova.compute.manager [-] [instance: 704ec14b-410e-4175-b032-69074b332d87] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1611.082270] env[68964]: DEBUG nova.network.neutron [-] [instance: 704ec14b-410e-4175-b032-69074b332d87] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1611.096737] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1611.096974] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1611.098465] env[68964]: INFO nova.compute.claims [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1611.107753] env[68964]: DEBUG nova.network.neutron [-] [instance: 704ec14b-410e-4175-b032-69074b332d87] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1611.123350] env[68964]: INFO nova.compute.manager [-] [instance: 704ec14b-410e-4175-b032-69074b332d87] Took 0.04 seconds to deallocate network for instance. 
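[annotation] "Claim successful on node domain-c8..." means the resource tracker accepted the new instance against the inventory reported for provider 63b0294e-f555-48a6-a542-3466427066a9; each tempest instance in this run requests {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, and with ten active instances the final resource view earlier showed used_vcpus=10 and used_ram=1792MB (512MB reserved plus 10 x 128MB). A simplified model of the capacity check, using only the inventory data logged in this section; placement treats capacity as (total - reserved) * allocation_ratio and compares it against the sum of allocations:

    # Inventory as logged for provider 63b0294e-f555-48a6-a542-3466427066a9.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    # Sum of existing allocations (10 instances at 1 VCPU / 128MB / 1GB each);
    # the tracker's "used_ram=1792MB" additionally folds in the 512MB reserved.
    used = {'VCPU': 10, 'MEMORY_MB': 1280, 'DISK_GB': 10}
    request = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}

    def fits(request, used, inventory):
        return all(
            used[rc] + amount <=
            (inv['total'] - inv['reserved']) * inv['allocation_ratio']
            for rc, amount in request.items()
            for inv in (inventory[rc],)
        )

    assert fits(request, used, inventory)  # matches "Claim successful" above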
[ 1611.214777] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f7270fe5-f360-4f3c-9374-fc380b35529d tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "704ec14b-410e-4175-b032-69074b332d87" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1611.293493] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-701d5fc8-d15c-437e-92ed-e6dc4b228f78 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.301814] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-106ec0e9-5063-4702-8fe2-d40c1bd83d8d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.331493] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-257e52ab-b48b-4333-8db5-0d7ba919dc5a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.338363] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e969248-4363-4f3a-9687-b9105b781c7b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.350959] env[68964]: DEBUG nova.compute.provider_tree [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1611.360726] env[68964]: DEBUG nova.scheduler.client.report [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1611.372702] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1611.373152] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1611.406685] env[68964]: DEBUG nova.compute.utils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1611.408157] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1611.408465] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1611.416383] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1611.472394] env[68964]: DEBUG nova.policy [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae5a60881ac14c52b769561e6f81d6ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6087614d846942ddbd06308568d3f1d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1611.475619] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1611.499789] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1611.500062] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1611.500253] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1611.500440] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1611.500584] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1611.500730] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1611.500933] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1611.501107] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1611.501275] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 
tempest-ServersTestJSON-1379046078-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1611.501436] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1611.501603] env[68964]: DEBUG nova.virt.hardware [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1611.502451] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78d246ef-03e9-49f3-af12-0cc895192a50 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.510314] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e87363-1fa9-4e0f-9231-8db6daa4467e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.858403] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Successfully created port: 18e34e4d-cead-460c-9944-ccd6df59543f {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1612.437498] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Successfully updated port: 18e34e4d-cead-460c-9944-ccd6df59543f {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1612.448767] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "refresh_cache-2243b807-c2a0-4917-aae8-5de31dc52e53" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1612.448893] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "refresh_cache-2243b807-c2a0-4917-aae8-5de31dc52e53" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1612.449055] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1612.485811] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1612.632956] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Updating instance_info_cache with network_info: [{"id": "18e34e4d-cead-460c-9944-ccd6df59543f", "address": "fa:16:3e:dd:96:35", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap18e34e4d-ce", "ovs_interfaceid": "18e34e4d-cead-460c-9944-ccd6df59543f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1612.643408] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "refresh_cache-2243b807-c2a0-4917-aae8-5de31dc52e53" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1612.643695] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance network_info: |[{"id": "18e34e4d-cead-460c-9944-ccd6df59543f", "address": "fa:16:3e:dd:96:35", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap18e34e4d-ce", "ovs_interfaceid": "18e34e4d-cead-460c-9944-ccd6df59543f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1612.644114] env[68964]: 
DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dd:96:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ec763be6-4041-4651-8fd7-3820cf0ab86d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '18e34e4d-cead-460c-9944-ccd6df59543f', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1612.651723] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating folder: Project (6087614d846942ddbd06308568d3f1d9). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1612.652250] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-260892f8-5d6c-4cb8-8973-087dff9191d8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.663016] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Created folder: Project (6087614d846942ddbd06308568d3f1d9) in parent group-v684465. [ 1612.663210] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating folder: Instances. Parent ref: group-v684591. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1612.663428] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03b578ba-ca71-4a8c-8083-7c55748d4327 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.672240] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Created folder: Instances in parent group-v684591. [ 1612.672462] env[68964]: DEBUG oslo.service.loopingcall [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1612.672642] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1612.672830] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e56f099f-02d3-4a1b-8ef3-4f66d6d80951 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.691334] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1612.691334] env[68964]: value = "task-3431739" [ 1612.691334] env[68964]: _type = "Task" [ 1612.691334] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1612.698824] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431739, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1612.994282] env[68964]: DEBUG nova.compute.manager [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Received event network-vif-plugged-18e34e4d-cead-460c-9944-ccd6df59543f {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1612.994496] env[68964]: DEBUG oslo_concurrency.lockutils [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] Acquiring lock "2243b807-c2a0-4917-aae8-5de31dc52e53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1612.994703] env[68964]: DEBUG oslo_concurrency.lockutils [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1612.994872] env[68964]: DEBUG oslo_concurrency.lockutils [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1612.995145] env[68964]: DEBUG nova.compute.manager [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] No waiting events found dispatching network-vif-plugged-18e34e4d-cead-460c-9944-ccd6df59543f {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1612.995349] env[68964]: WARNING nova.compute.manager [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Received unexpected event network-vif-plugged-18e34e4d-cead-460c-9944-ccd6df59543f for instance with vm_state building and task_state spawning. [ 1612.995513] env[68964]: DEBUG nova.compute.manager [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Received event network-changed-18e34e4d-cead-460c-9944-ccd6df59543f {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1612.995668] env[68964]: DEBUG nova.compute.manager [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Refreshing instance network info cache due to event network-changed-18e34e4d-cead-460c-9944-ccd6df59543f. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1612.995857] env[68964]: DEBUG oslo_concurrency.lockutils [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] Acquiring lock "refresh_cache-2243b807-c2a0-4917-aae8-5de31dc52e53" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1612.995995] env[68964]: DEBUG oslo_concurrency.lockutils [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] Acquired lock "refresh_cache-2243b807-c2a0-4917-aae8-5de31dc52e53" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1612.996164] env[68964]: DEBUG nova.network.neutron [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Refreshing network info cache for port 18e34e4d-cead-460c-9944-ccd6df59543f {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1613.202266] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431739, 'name': CreateVM_Task, 'duration_secs': 0.286} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1613.202565] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1613.203551] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1613.203708] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1613.204140] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1613.204278] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-295935af-d8c0-45b2-a7fb-d875d5923b37 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.209521] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 1613.209521] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52c197a5-81f1-c57a-9e86-e83a5746c492" [ 1613.209521] env[68964]: _type = "Task" [ 1613.209521] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1613.217817] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52c197a5-81f1-c57a-9e86-e83a5746c492, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1613.345673] env[68964]: DEBUG nova.network.neutron [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Updated VIF entry in instance network info cache for port 18e34e4d-cead-460c-9944-ccd6df59543f. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1613.346090] env[68964]: DEBUG nova.network.neutron [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Updating instance_info_cache with network_info: [{"id": "18e34e4d-cead-460c-9944-ccd6df59543f", "address": "fa:16:3e:dd:96:35", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap18e34e4d-ce", "ovs_interfaceid": "18e34e4d-cead-460c-9944-ccd6df59543f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1613.358796] env[68964]: DEBUG oslo_concurrency.lockutils [req-baf6b23a-e0d9-483d-8e00-2c4acc5c3ca0 req-0a72e434-2cee-43c0-8c53-f8725018ee73 service nova] Releasing lock "refresh_cache-2243b807-c2a0-4917-aae8-5de31dc52e53" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1613.720644] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1613.720923] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1613.721148] 
env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1614.843744] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1614.844059] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1630.723715] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1631.724224] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1632.724992] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1633.719680] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1633.724278] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1635.725668] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1635.725957] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1635.725957] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1635.747014] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747181] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747315] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747444] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747568] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747690] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747813] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.747932] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.748061] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.748184] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1635.748326] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1636.724778] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1636.724959] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1637.724336] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1637.735670] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1637.735946] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1637.736137] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1637.736323] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1637.737472] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-435b74e4-ba64-4cee-9a85-b9051a39cd9a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.746461] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f954252-4078-4575-8c11-88ffa1ac9624 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.760795] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4863599a-b614-48f2-b977-07c2bbe7832c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.767195] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85a2727f-bbc2-47d9-b22e-8e9e78700782 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1637.796040] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: 
name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180923MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1637.796203] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1637.796389] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1637.865143] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.865321] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.865448] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.865568] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.865684] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.865802] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.865916] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.866041] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.866160] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.866277] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1637.878735] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1637.890579] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance c3065e70-0bec-4b15-ae2d-fee36304f41e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1637.900176] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1637.909263] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1637.909479] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1637.909623] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1638.058819] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed6f1eac-3a8f-4024-ae36-958992c8726a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1638.066564] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23954329-b64d-440c-bcdd-37f669f1028f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1638.096622] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33547094-f848-4879-96c9-7e2b7dd2e402 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1638.103560] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6733aaf0-cdff-40c2-9f38-3e8e0e83f7a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1638.116468] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1638.124777] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1638.137382] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1638.137557] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1639.137926] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1656.748399] env[68964]: WARNING oslo_vmware.rw_handles [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1656.748399] env[68964]: ERROR oslo_vmware.rw_handles [ 1656.749053] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1656.751079] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1656.751366] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Copying Virtual Disk [datastore1] vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/2f76a336-8897-4cac-a67b-87a344ea7002/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1656.751673] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3a08a94d-7f0a-4637-a378-a252f90a6955 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1656.759592] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Waiting for the task: (returnval){ [ 1656.759592] env[68964]: value = "task-3431740" [ 1656.759592] env[68964]: _type = "Task" [ 1656.759592] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1656.767825] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Task: {'id': task-3431740, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1657.269822] env[68964]: DEBUG oslo_vmware.exceptions [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1657.270146] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1657.270688] env[68964]: ERROR nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1657.270688] env[68964]: Faults: ['InvalidArgument'] [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Traceback (most recent call last): [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] yield resources [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self.driver.spawn(context, instance, image_meta, [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: 
ec476af0-9287-4f82-a4cd-c2a3771f1b68] self._fetch_image_if_missing(context, vi) [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] image_cache(vi, tmp_image_ds_loc) [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] vm_util.copy_virtual_disk( [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] session._wait_for_task(vmdk_copy_task) [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] return self.wait_for_task(task_ref) [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] return evt.wait() [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] result = hub.switch() [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] return self.greenlet.switch() [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self.f(*self.args, **self.kw) [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] raise exceptions.translate_fault(task_info.error) [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Faults: ['InvalidArgument'] [ 1657.270688] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] [ 1657.271674] env[68964]: INFO nova.compute.manager 
[None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Terminating instance [ 1657.272483] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1657.272690] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1657.272930] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-274983fb-b2ce-4d57-833c-58c97e433cc6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.274992] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1657.275207] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1657.275902] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c05d67ae-5695-495e-ad28-b1d216c98f85 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.282908] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1657.283939] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-edd5068a-c5cb-4b3a-8033-a9018d9bf2a5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.285324] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1657.285493] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1657.286158] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ead87293-7d02-43d1-a67d-3e0cae5ff2b0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.291336] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Waiting for the task: (returnval){ [ 1657.291336] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52de4ce7-326b-318f-1d9c-f84ad62d387f" [ 1657.291336] env[68964]: _type = "Task" [ 1657.291336] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1657.298854] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52de4ce7-326b-318f-1d9c-f84ad62d387f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1657.356909] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1657.357236] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1657.357465] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Deleting the datastore file [datastore1] ec476af0-9287-4f82-a4cd-c2a3771f1b68 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1657.357752] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cccaae25-1764-4ff6-93ea-29078145414c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.364149] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Waiting for the task: (returnval){ [ 1657.364149] env[68964]: value = "task-3431742" [ 1657.364149] env[68964]: _type = "Task" [ 1657.364149] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1657.371936] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Task: {'id': task-3431742, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1657.801905] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1657.802183] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Creating directory with path [datastore1] vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1657.802426] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61b8f272-12a0-4f62-9789-b192ebbcf4ca {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.813431] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Created directory with path [datastore1] vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1657.813626] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Fetch image to [datastore1] vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1657.813803] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1657.814543] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eee0374c-ea42-4689-9539-3d974386a670 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.821178] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79609890-57f1-496e-9d6e-35c078c9e32d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.830087] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47234edf-e2e5-4d0a-8ebd-ccb59df7fd35 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.862134] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d6ca3560-29ac-4ad9-a7ee-90c1360e4f55 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.873291] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a9af6ec7-a107-4779-92ed-118a02d04ac3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.874932] env[68964]: DEBUG oslo_vmware.api [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Task: {'id': task-3431742, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075036} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1657.875180] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1657.875358] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1657.875522] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1657.875694] env[68964]: INFO nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1657.877842] env[68964]: DEBUG nova.compute.claims [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1657.878022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1657.878238] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1657.896259] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1657.947085] env[68964]: DEBUG oslo_vmware.rw_handles [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1658.006325] env[68964]: DEBUG oslo_vmware.rw_handles [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1658.006506] env[68964]: DEBUG oslo_vmware.rw_handles [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1658.128026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cda76c05-62ed-43f2-9267-491c5bd98f76 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.135322] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-713dd191-78fe-4935-909b-09f84a55ef2d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.164292] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6973504-e37b-4c8d-a3c0-c216bc35406a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.170939] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f9118ea-6011-4255-9e52-a795120ead41 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.183605] env[68964]: DEBUG nova.compute.provider_tree [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1658.192575] env[68964]: DEBUG nova.scheduler.client.report [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1658.205085] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1658.205601] env[68964]: ERROR nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1658.205601] env[68964]: Faults: ['InvalidArgument'] [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Traceback (most recent call last): [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1658.205601] 
env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self.driver.spawn(context, instance, image_meta, [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self._fetch_image_if_missing(context, vi) [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] image_cache(vi, tmp_image_ds_loc) [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] vm_util.copy_virtual_disk( [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] session._wait_for_task(vmdk_copy_task) [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] return self.wait_for_task(task_ref) [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] return evt.wait() [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] result = hub.switch() [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] return self.greenlet.switch() [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] self.f(*self.args, **self.kw) [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] raise exceptions.translate_fault(task_info.error) [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Faults: ['InvalidArgument'] [ 1658.205601] env[68964]: ERROR nova.compute.manager [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] [ 1658.206469] env[68964]: DEBUG nova.compute.utils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1658.207615] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Build of instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 was re-scheduled: A specified parameter was not correct: fileType [ 1658.207615] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1658.207968] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1658.208157] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1658.208330] env[68964]: DEBUG nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1658.208487] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1658.664877] env[68964]: DEBUG nova.network.neutron [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1658.678008] env[68964]: INFO nova.compute.manager [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Took 0.47 seconds to deallocate network for instance. [ 1658.783207] env[68964]: INFO nova.scheduler.client.report [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Deleted allocations for instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 [ 1658.811773] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6a2b1be2-40d2-43b6-bebf-d439d1687837 tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 541.885s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1658.813176] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 344.844s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1658.813489] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Acquiring lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1658.813726] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1658.813873] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1658.816017] env[68964]: INFO nova.compute.manager [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Terminating instance [ 1658.818974] env[68964]: DEBUG nova.compute.manager [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1658.819588] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1658.820447] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dd22160a-a87c-4bdc-89db-b6b3f269d9a2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.831512] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ec90d4b-b688-4f98-b75a-b607ae455a96 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.842805] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1658.864282] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ec476af0-9287-4f82-a4cd-c2a3771f1b68 could not be found. 
[ 1658.864499] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1658.864677] env[68964]: INFO nova.compute.manager [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1658.864922] env[68964]: DEBUG oslo.service.loopingcall [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1658.865175] env[68964]: DEBUG nova.compute.manager [-] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1658.865274] env[68964]: DEBUG nova.network.neutron [-] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1658.887867] env[68964]: DEBUG nova.network.neutron [-] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1658.891595] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1658.891824] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1658.893224] env[68964]: INFO nova.compute.claims [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1658.896090] env[68964]: INFO nova.compute.manager [-] [instance: ec476af0-9287-4f82-a4cd-c2a3771f1b68] Took 0.03 seconds to deallocate network for instance. 
[ 1658.985289] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1ff3886f-4177-45bc-8d9d-3201377e047e tempest-AttachInterfacesTestJSON-1409797562 tempest-AttachInterfacesTestJSON-1409797562-project-member] Lock "ec476af0-9287-4f82-a4cd-c2a3771f1b68" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.093826] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e464016e-9079-4cdb-89df-1c015d0de93d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.101108] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9461c1-8387-4d4c-847f-dc1ef3289bc8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.131500] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86c233ae-7f69-4e99-b38c-19e09af125b9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.138231] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce27de01-0e28-41f1-8907-a5fabb062ada {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.151031] env[68964]: DEBUG nova.compute.provider_tree [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1659.159543] env[68964]: DEBUG nova.scheduler.client.report [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1659.175275] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.175738] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1659.208912] env[68964]: DEBUG nova.compute.utils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1659.210198] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1659.210373] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1659.220413] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1659.267079] env[68964]: DEBUG nova.policy [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdf2567a5f234d3ca11c17b2a6c50dab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3159a58c1d23417eb9c756a88435d17e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1659.280826] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1659.307225] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1659.307225] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1659.307225] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1659.307944] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1659.307944] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1659.308053] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1659.308310] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1659.308468] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1659.308674] 
env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1659.308840] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1659.309019] env[68964]: DEBUG nova.virt.hardware [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1659.309894] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ac6198a-122f-4ffa-b876-c3452a0cfe5d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.318150] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbf24475-286c-4837-8ab2-05719daaa686 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.556422] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Successfully created port: 572a74c7-7ffd-4cf8-86c2-b693a5deeee3 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1660.223835] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Successfully updated port: 572a74c7-7ffd-4cf8-86c2-b693a5deeee3 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1660.234588] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "refresh_cache-8f94d3c8-4674-463d-8829-68a184967183" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1660.234751] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "refresh_cache-8f94d3c8-4674-463d-8829-68a184967183" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1660.234904] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1660.272735] env[68964]: DEBUG 
nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1660.438147] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Updating instance_info_cache with network_info: [{"id": "572a74c7-7ffd-4cf8-86c2-b693a5deeee3", "address": "fa:16:3e:47:84:06", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap572a74c7-7f", "ovs_interfaceid": "572a74c7-7ffd-4cf8-86c2-b693a5deeee3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1660.448795] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "refresh_cache-8f94d3c8-4674-463d-8829-68a184967183" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1660.449114] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance network_info: |[{"id": "572a74c7-7ffd-4cf8-86c2-b693a5deeee3", "address": "fa:16:3e:47:84:06", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap572a74c7-7f", 
"ovs_interfaceid": "572a74c7-7ffd-4cf8-86c2-b693a5deeee3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1660.449502] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:47:84:06', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '572a74c7-7ffd-4cf8-86c2-b693a5deeee3', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1660.456804] env[68964]: DEBUG oslo.service.loopingcall [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1660.457286] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1660.457506] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f4f494e5-3889-4888-9720-3fedc79abfd2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.478433] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1660.478433] env[68964]: value = "task-3431743" [ 1660.478433] env[68964]: _type = "Task" [ 1660.478433] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1660.485773] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431743, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1660.807987] env[68964]: DEBUG nova.compute.manager [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Received event network-vif-plugged-572a74c7-7ffd-4cf8-86c2-b693a5deeee3 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1660.808219] env[68964]: DEBUG oslo_concurrency.lockutils [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] Acquiring lock "8f94d3c8-4674-463d-8829-68a184967183-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1660.808423] env[68964]: DEBUG oslo_concurrency.lockutils [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] Lock "8f94d3c8-4674-463d-8829-68a184967183-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1660.808585] env[68964]: DEBUG oslo_concurrency.lockutils [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] Lock "8f94d3c8-4674-463d-8829-68a184967183-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1660.808791] env[68964]: DEBUG nova.compute.manager [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] No waiting events found dispatching network-vif-plugged-572a74c7-7ffd-4cf8-86c2-b693a5deeee3 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1660.808959] env[68964]: WARNING nova.compute.manager [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Received unexpected event network-vif-plugged-572a74c7-7ffd-4cf8-86c2-b693a5deeee3 for instance with vm_state building and task_state spawning. [ 1660.809133] env[68964]: DEBUG nova.compute.manager [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Received event network-changed-572a74c7-7ffd-4cf8-86c2-b693a5deeee3 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1660.809277] env[68964]: DEBUG nova.compute.manager [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Refreshing instance network info cache due to event network-changed-572a74c7-7ffd-4cf8-86c2-b693a5deeee3. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1660.809449] env[68964]: DEBUG oslo_concurrency.lockutils [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] Acquiring lock "refresh_cache-8f94d3c8-4674-463d-8829-68a184967183" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1660.809575] env[68964]: DEBUG oslo_concurrency.lockutils [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] Acquired lock "refresh_cache-8f94d3c8-4674-463d-8829-68a184967183" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1660.809838] env[68964]: DEBUG nova.network.neutron [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Refreshing network info cache for port 572a74c7-7ffd-4cf8-86c2-b693a5deeee3 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1660.988868] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431743, 'name': CreateVM_Task, 'duration_secs': 0.268116} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1660.989055] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1660.989817] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1660.989987] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1660.990318] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1660.990566] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a0be7495-cc0b-4ba8-baac-30966f873b2f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.995202] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 1660.995202] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ae96a0-105c-1307-55c0-fdf8256ac552" [ 1660.995202] env[68964]: _type = "Task" [ 1660.995202] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1661.003391] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ae96a0-105c-1307-55c0-fdf8256ac552, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1661.277430] env[68964]: DEBUG nova.network.neutron [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Updated VIF entry in instance network info cache for port 572a74c7-7ffd-4cf8-86c2-b693a5deeee3. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1661.277786] env[68964]: DEBUG nova.network.neutron [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Updating instance_info_cache with network_info: [{"id": "572a74c7-7ffd-4cf8-86c2-b693a5deeee3", "address": "fa:16:3e:47:84:06", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap572a74c7-7f", "ovs_interfaceid": "572a74c7-7ffd-4cf8-86c2-b693a5deeee3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1661.287787] env[68964]: DEBUG oslo_concurrency.lockutils [req-a167ef75-563f-4794-b9e6-1919e7c8d551 req-f4ffec50-f348-42b3-b07d-44bd182845a9 service nova] Releasing lock "refresh_cache-8f94d3c8-4674-463d-8829-68a184967183" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1661.505946] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1661.505946] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1661.505946] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1664.663414] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1665.859403] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "2243b807-c2a0-4917-aae8-5de31dc52e53" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1674.397309] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "8f94d3c8-4674-463d-8829-68a184967183" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1689.789389] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1689.789727] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1692.723911] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1692.724246] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1693.720733] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time 
{{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1693.724347] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1695.724938] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1696.724900] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1696.724900] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1696.725284] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1696.747739] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.747903] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748031] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748165] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748325] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748407] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748515] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748885] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748885] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748885] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1696.748997] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1697.724544] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1697.724695] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1697.724887] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1697.736365] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1697.736893] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1697.736893] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1697.736893] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1697.738062] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79786464-aa82-45a8-a8e0-219bd7a1eb42 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.747352] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a4fa335-e6dc-4283-92f8-260bde16ed6c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.761387] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da22f595-a729-40a3-b243-61381a0625f3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.767885] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c23fe4f-a1ec-4beb-be0c-27fc03ffaff4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1697.799810] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180948MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1697.799967] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1697.800183] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1697.876523] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.876687] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.876813] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.876957] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.877151] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.877290] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.877413] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.877529] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.877643] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.877816] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1697.891595] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1697.910075] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1697.926023] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1697.926023] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1697.926023] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1698.126859] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-763cf63b-fd29-47a7-8ffd-f8c1e83b9bdc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.134644] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d608b8d-34c3-44ea-8f5c-a857bee209d0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.164623] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c0a5f9c-3ea3-42bc-9f83-d84e16a0f66f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.172322] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6df75df-f0f6-493e-afe8-fa30e7579cbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1698.187799] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1698.200916] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1698.220418] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1698.220661] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.420s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1699.222074] env[68964]: DEBUG oslo_service.periodic_task [None 
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1699.775033] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "a257b05d-fa9a-4d1a-9086-d571e45a5283" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1699.775271] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1700.720088] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1707.207676] env[68964]: WARNING oslo_vmware.rw_handles [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1707.207676] env[68964]: ERROR oslo_vmware.rw_handles [ 1707.208343] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1707.211030] 
env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1707.211282] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Copying Virtual Disk [datastore1] vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/af69d17c-49c6-4033-adc2-2e5f3a37718a/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1707.211571] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6cf02dd7-5304-4b56-b43e-aa6404009fb3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.219205] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Waiting for the task: (returnval){ [ 1707.219205] env[68964]: value = "task-3431744" [ 1707.219205] env[68964]: _type = "Task" [ 1707.219205] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1707.227261] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Task: {'id': task-3431744, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1707.729268] env[68964]: DEBUG oslo_vmware.exceptions [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1707.729572] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1707.730149] env[68964]: ERROR nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1707.730149] env[68964]: Faults: ['InvalidArgument'] [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Traceback (most recent call last): [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] yield resources [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self.driver.spawn(context, instance, image_meta, [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self._fetch_image_if_missing(context, vi) [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] image_cache(vi, tmp_image_ds_loc) [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] vm_util.copy_virtual_disk( [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] session._wait_for_task(vmdk_copy_task) [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] return self.wait_for_task(task_ref) [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] return evt.wait() [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] result = hub.switch() [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] return self.greenlet.switch() [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self.f(*self.args, **self.kw) [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] raise exceptions.translate_fault(task_info.error) [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Faults: ['InvalidArgument'] [ 1707.730149] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] [ 1707.731257] env[68964]: INFO nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Terminating instance [ 1707.731986] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1707.732208] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1707.733050] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7674013a-1170-4220-ba00-6c2ad42dcf62 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.735222] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1707.735424] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1707.736193] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b7f767-cf49-44d1-90e9-38146b948dc5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.742927] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1707.743195] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-460c5226-3b89-4e32-a9ee-c21334de33cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.745444] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1707.745616] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1707.746598] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f698cc5-a781-4c5c-83ee-eab8c0bf4a98 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.751492] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Waiting for the task: (returnval){ [ 1707.751492] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52cdcab5-fbdd-1086-94ab-b778c3992f43" [ 1707.751492] env[68964]: _type = "Task" [ 1707.751492] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1707.758701] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52cdcab5-fbdd-1086-94ab-b778c3992f43, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1707.816648] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1707.816886] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1707.817040] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Deleting the datastore file [datastore1] 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1707.817318] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ff3d5cdf-dd3c-42ae-b040-745b6d507035 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.823143] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Waiting for the task: (returnval){ [ 1707.823143] env[68964]: value = "task-3431746" [ 1707.823143] env[68964]: _type = "Task" [ 1707.823143] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1707.830545] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Task: {'id': task-3431746, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1708.262059] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1708.262378] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Creating directory with path [datastore1] vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1708.262573] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-db7bae34-6081-423c-a496-8d694308b0a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.275254] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Created directory with path [datastore1] vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1708.275607] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Fetch image to [datastore1] vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1708.275923] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1708.277159] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f750fa7-2ced-4146-ad87-3240e17973b6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.289075] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd0e82e0-c298-4dc5-a396-851e929e5408 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.304022] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c41eacf5-0b14-4606-91d1-16b62b6e5512 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.337607] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64d35c4c-724d-4ebd-b5bc-7717bb331f3d 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.344285] env[68964]: DEBUG oslo_vmware.api [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Task: {'id': task-3431746, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072356} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1708.345666] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1708.345856] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1708.346035] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1708.346212] env[68964]: INFO nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Took 0.61 seconds to destroy the instance on the hypervisor. 
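[editor's note] The destroy path above ends with the same task-polling pattern that recurs throughout this log: Nova invokes a vCenter method (here FileManager.DeleteDatastoreFile_Task), receives a Task reference such as task-3431746, and oslo.vmware polls it until it reports progress, success, or a fault — producing the "Task: {...} progress is 0%" and "completed successfully ... duration_secs" lines seen above. The following is a minimal stand-alone sketch of that loop, not the oslo.vmware implementation; get_task_info and the TaskInfo shape are hypothetical stand-ins for the real PropertyCollector round trip.

    import time
    from dataclasses import dataclass
    from typing import Callable, Optional

    # Hypothetical snapshot of a vCenter TaskInfo object; the real one is a
    # SOAP managed object fetched via PropertyCollector.RetrievePropertiesEx.
    @dataclass
    class TaskInfo:
        state: str                      # "queued" | "running" | "success" | "error"
        progress: int = 0               # percent complete, as logged above
        error: Optional[Exception] = None

    def wait_for_task(get_task_info: Callable[[], TaskInfo],
                      poll_interval: float = 0.5,
                      timeout: float = 300.0) -> TaskInfo:
        """Poll a vCenter task until it finishes, mirroring the
        'progress is 0%' / 'completed successfully' lines in this log."""
        deadline = time.monotonic() + timeout
        while True:
            info = get_task_info()
            if info.state == "success":
                return info
            if info.state == "error":
                # oslo.vmware translates the server fault at this point,
                # e.g. the VimFaultException 'A specified parameter was not
                # correct: fileType' raised earlier for CopyVirtualDisk_Task.
                raise info.error or RuntimeError("task failed without a fault")
            if time.monotonic() > deadline:
                raise TimeoutError("task did not complete in time")
            time.sleep(poll_interval)   # the real loop yields via an eventlet
                                        # looping call rather than blocking

The polling design follows from the vSphere API being asynchronous: long-running operations return immediately with a Task object, so the client must re-read its state; in Nova this runs on a green thread, which is why each poll appears as a separate _poll_task DEBUG record. [end editor's note]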
[ 1708.347905] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2d15bc1a-43f7-47ce-a19e-10a3031d96dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.349733] env[68964]: DEBUG nova.compute.claims [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1708.349891] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1708.350114] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.372365] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1708.526158] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1708.585763] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1708.585948] env[68964]: DEBUG oslo_vmware.rw_handles [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1708.634561] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-742f73a2-6a9d-48de-9808-d2fc879518f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.642052] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92e02d1e-f873-45dd-96ea-237a3f5ed111 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.671905] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-524a0699-a9d1-47ea-86d0-e37e4efb3fd9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.678529] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b186a16-6ec9-4aa7-9ab5-82b471add531 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.691197] env[68964]: DEBUG nova.compute.provider_tree [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1708.699387] env[68964]: DEBUG nova.scheduler.client.report [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1708.712885] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.363s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1708.713400] env[68964]: ERROR nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1708.713400] env[68964]: Faults: ['InvalidArgument'] [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Traceback (most recent call last): [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1708.713400] env[68964]: 
ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self.driver.spawn(context, instance, image_meta, [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self._fetch_image_if_missing(context, vi) [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] image_cache(vi, tmp_image_ds_loc) [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] vm_util.copy_virtual_disk( [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] session._wait_for_task(vmdk_copy_task) [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] return self.wait_for_task(task_ref) [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] return evt.wait() [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] result = hub.switch() [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] return self.greenlet.switch() [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] self.f(*self.args, **self.kw) [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] raise exceptions.translate_fault(task_info.error) [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Faults: ['InvalidArgument'] [ 1708.713400] env[68964]: ERROR nova.compute.manager [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] [ 1708.714236] env[68964]: DEBUG nova.compute.utils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1708.715392] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Build of instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb was re-scheduled: A specified parameter was not correct: fileType [ 1708.715392] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1708.715750] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1708.715918] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1708.716095] env[68964]: DEBUG nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1708.716285] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1709.014215] env[68964]: DEBUG nova.network.neutron [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1709.025565] env[68964]: INFO nova.compute.manager [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Took 0.31 seconds to deallocate network for instance. [ 1709.116685] env[68964]: INFO nova.scheduler.client.report [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Deleted allocations for instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb [ 1709.136166] env[68964]: DEBUG oslo_concurrency.lockutils [None req-6cd7bd4b-8637-43ac-bad3-65c0621fca42 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 585.838s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.137295] env[68964]: DEBUG oslo_concurrency.lockutils [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 389.351s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1709.137477] env[68964]: DEBUG oslo_concurrency.lockutils [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Acquiring lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1709.137681] env[68964]: DEBUG oslo_concurrency.lockutils [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1709.137846] env[68964]: DEBUG oslo_concurrency.lockutils [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.139802] env[68964]: INFO nova.compute.manager [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Terminating instance [ 1709.142739] env[68964]: DEBUG nova.compute.manager [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1709.142932] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1709.143221] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7d354cfe-21cb-4ead-8bbb-cc0785cb38f0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.153554] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76763bfb-45a0-41bf-8cdc-2e3fb5650f72 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.163629] env[68964]: DEBUG nova.compute.manager [None req-e878b2ba-c220-4e72-bf52-cac9dce82d18 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: c3065e70-0bec-4b15-ae2d-fee36304f41e] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1709.184999] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb could not be found. [ 1709.185222] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1709.185398] env[68964]: INFO nova.compute.manager [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Took 0.04 seconds to destroy the instance on the hypervisor. 
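
The compute_resources and per-instance lock records in this section always come in a three-step lifecycle: Acquiring, acquired (with how long the caller waited), and released (with how long it was held). A rough stand-in for that instrumentation, using a plain threading.Lock rather than oslo_concurrency.lockutils, could look like the following; instrumented_lock is a hypothetical helper for illustration only.

    import contextlib
    import threading
    import time

    _locks = {}
    _registry_guard = threading.Lock()

    @contextlib.contextmanager
    def instrumented_lock(name, by):
        """Log waited/held timings around a named lock, as in the records above."""
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        print(f'Acquiring lock "{name}" by "{by}"')
        t0 = time.monotonic()
        lock.acquire()
        t1 = time.monotonic()
        print(f'Lock "{name}" acquired by "{by}" :: waited {t1 - t0:.3f}s')
        try:
            yield
        finally:
            lock.release()
            print(f'Lock "{name}" "released" by "{by}" :: held '
                  f'{time.monotonic() - t1:.3f}s')

Used as, say, with instrumented_lock("compute_resources", "ResourceTracker.instance_claim"): ..., this would emit the same three-line lifecycle seen around the claim and terminate operations here.
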
[ 1709.185636] env[68964]: DEBUG oslo.service.loopingcall [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1709.185851] env[68964]: DEBUG nova.compute.manager [-] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1709.185947] env[68964]: DEBUG nova.network.neutron [-] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1709.188209] env[68964]: DEBUG nova.compute.manager [None req-e878b2ba-c220-4e72-bf52-cac9dce82d18 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] [instance: c3065e70-0bec-4b15-ae2d-fee36304f41e] Instance disappeared before build. {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1709.212350] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e878b2ba-c220-4e72-bf52-cac9dce82d18 tempest-ImagesTestJSON-1338147904 tempest-ImagesTestJSON-1338147904-project-member] Lock "c3065e70-0bec-4b15-ae2d-fee36304f41e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.676s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.212600] env[68964]: DEBUG nova.network.neutron [-] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1709.220102] env[68964]: INFO nova.compute.manager [-] [instance: 1beb5ead-37f6-4dbe-b0c2-bbc5125334eb] Took 0.03 seconds to deallocate network for instance. [ 1709.225049] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1709.278959] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1709.279230] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1709.280778] env[68964]: INFO nova.compute.claims [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1709.323749] env[68964]: DEBUG oslo_concurrency.lockutils [None req-67da04f9-ca9d-4f5a-ade3-285ac28f0922 tempest-ServersNegativeTestJSON-631132434 tempest-ServersNegativeTestJSON-631132434-project-member] Lock "1beb5ead-37f6-4dbe-b0c2-bbc5125334eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.186s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.515727] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51a95b8a-cc72-4d58-a8aa-20e33643863d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.523184] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c8c40b6-8cea-4465-949f-6d476db85751 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.553598] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-118571fa-1345-4543-966e-31a80e64269b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.560759] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e001eff-33ba-4272-9749-aee7e78dcb7e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.576164] env[68964]: DEBUG nova.compute.provider_tree [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1709.584610] env[68964]: DEBUG nova.scheduler.client.report [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 
'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1709.599630] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.600125] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1709.633572] env[68964]: DEBUG nova.compute.utils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1709.635286] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1709.635725] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1709.644104] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Start building block device mappings for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1709.697893] env[68964]: DEBUG nova.policy [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5c517b354248469bb1d7be2385dc8578', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '77b70ecb389d4139ae594273c411cefb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1709.714027] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1709.741464] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1709.741769] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1709.741933] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1709.742132] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1709.742277] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1709.742419] env[68964]: DEBUG nova.virt.hardware [None 
req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1709.742624] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1709.742780] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1709.742941] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1709.743486] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1709.743700] env[68964]: DEBUG nova.virt.hardware [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1709.745146] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd900b0c-22f3-4cce-a248-a3e4f81b93a4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.753141] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-478c1ea1-fb77-456d-a831-bc7a61fdcacc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.986777] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Successfully created port: 07f9f86f-23d6-4b2c-a589-37bda4780837 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1710.648222] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Successfully updated port: 07f9f86f-23d6-4b2c-a589-37bda4780837 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1710.660658] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 
tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "refresh_cache-a8d43f08-4cf1-40aa-ad31-2b02b70d6229" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1710.660803] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquired lock "refresh_cache-a8d43f08-4cf1-40aa-ad31-2b02b70d6229" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1710.660958] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1710.714595] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1711.111553] env[68964]: DEBUG nova.compute.manager [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Received event network-vif-plugged-07f9f86f-23d6-4b2c-a589-37bda4780837 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1711.111799] env[68964]: DEBUG oslo_concurrency.lockutils [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] Acquiring lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1711.112014] env[68964]: DEBUG oslo_concurrency.lockutils [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1711.112362] env[68964]: DEBUG oslo_concurrency.lockutils [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1711.112362] env[68964]: DEBUG nova.compute.manager [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] No waiting events found dispatching network-vif-plugged-07f9f86f-23d6-4b2c-a589-37bda4780837 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1711.112517] env[68964]: WARNING nova.compute.manager [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Received unexpected event 
network-vif-plugged-07f9f86f-23d6-4b2c-a589-37bda4780837 for instance with vm_state building and task_state spawning. [ 1711.112672] env[68964]: DEBUG nova.compute.manager [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Received event network-changed-07f9f86f-23d6-4b2c-a589-37bda4780837 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1711.112827] env[68964]: DEBUG nova.compute.manager [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Refreshing instance network info cache due to event network-changed-07f9f86f-23d6-4b2c-a589-37bda4780837. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1711.113095] env[68964]: DEBUG oslo_concurrency.lockutils [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] Acquiring lock "refresh_cache-a8d43f08-4cf1-40aa-ad31-2b02b70d6229" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1711.216208] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Updating instance_info_cache with network_info: [{"id": "07f9f86f-23d6-4b2c-a589-37bda4780837", "address": "fa:16:3e:94:f0:53", "network": {"id": "7b9dd980-20e0-4abe-9902-4a42906cc3d6", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-108522489-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "77b70ecb389d4139ae594273c411cefb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4576b9d4-535c-40aa-b078-246f671f216e", "external-id": "nsx-vlan-transportzone-27", "segmentation_id": 27, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07f9f86f-23", "ovs_interfaceid": "07f9f86f-23d6-4b2c-a589-37bda4780837", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1711.228595] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Releasing lock "refresh_cache-a8d43f08-4cf1-40aa-ad31-2b02b70d6229" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1711.228893] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance network_info: |[{"id": "07f9f86f-23d6-4b2c-a589-37bda4780837", "address": "fa:16:3e:94:f0:53", "network": {"id": "7b9dd980-20e0-4abe-9902-4a42906cc3d6", 
"bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-108522489-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "77b70ecb389d4139ae594273c411cefb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4576b9d4-535c-40aa-b078-246f671f216e", "external-id": "nsx-vlan-transportzone-27", "segmentation_id": 27, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07f9f86f-23", "ovs_interfaceid": "07f9f86f-23d6-4b2c-a589-37bda4780837", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1711.229225] env[68964]: DEBUG oslo_concurrency.lockutils [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] Acquired lock "refresh_cache-a8d43f08-4cf1-40aa-ad31-2b02b70d6229" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1711.229399] env[68964]: DEBUG nova.network.neutron [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Refreshing network info cache for port 07f9f86f-23d6-4b2c-a589-37bda4780837 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1711.230662] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:f0:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4576b9d4-535c-40aa-b078-246f671f216e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '07f9f86f-23d6-4b2c-a589-37bda4780837', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1711.238142] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Creating folder: Project (77b70ecb389d4139ae594273c411cefb). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1711.241450] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-736c39f9-d53d-4595-a009-a32102e5c9a5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.253521] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Created folder: Project (77b70ecb389d4139ae594273c411cefb) in parent group-v684465. 
[ 1711.254278] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Creating folder: Instances. Parent ref: group-v684595. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1711.254534] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe9c76bc-67f7-4400-b8cb-d57be83c2f0c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.264038] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Created folder: Instances in parent group-v684595. [ 1711.264366] env[68964]: DEBUG oslo.service.loopingcall [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1711.264612] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1711.264878] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4a0cda7f-db12-4fbc-8acb-e588b5167abf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.289853] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1711.289853] env[68964]: value = "task-3431749" [ 1711.289853] env[68964]: _type = "Task" [ 1711.289853] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1711.297548] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431749, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.508177] env[68964]: DEBUG nova.network.neutron [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Updated VIF entry in instance network info cache for port 07f9f86f-23d6-4b2c-a589-37bda4780837. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1711.508567] env[68964]: DEBUG nova.network.neutron [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Updating instance_info_cache with network_info: [{"id": "07f9f86f-23d6-4b2c-a589-37bda4780837", "address": "fa:16:3e:94:f0:53", "network": {"id": "7b9dd980-20e0-4abe-9902-4a42906cc3d6", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-108522489-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "77b70ecb389d4139ae594273c411cefb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4576b9d4-535c-40aa-b078-246f671f216e", "external-id": "nsx-vlan-transportzone-27", "segmentation_id": 27, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07f9f86f-23", "ovs_interfaceid": "07f9f86f-23d6-4b2c-a589-37bda4780837", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1711.520539] env[68964]: DEBUG oslo_concurrency.lockutils [req-609d07d2-a4cd-49f9-9a07-d70e03ccc65c req-ab218ddc-19a6-459e-b39d-0f216c03e69c service nova] Releasing lock "refresh_cache-a8d43f08-4cf1-40aa-ad31-2b02b70d6229" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1711.800088] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431749, 'name': CreateVM_Task, 'duration_secs': 0.266842} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1711.800376] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1711.800927] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1711.801114] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1711.801416] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1711.801661] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3aa1c167-48cd-47fd-9c7c-b6ed4a31c300 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.805847] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Waiting for the task: (returnval){ [ 1711.805847] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]521e5fcc-6992-5540-4b99-6cc0579307b5" [ 1711.805847] env[68964]: _type = "Task" [ 1711.805847] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1711.812929] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]521e5fcc-6992-5540-4b99-6cc0579307b5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1712.317762] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1712.317762] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1712.317762] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1729.993867] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1752.730024] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1753.725301] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1753.725573] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1754.054302] env[68964]: WARNING oslo_vmware.rw_handles [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 
1754.054302] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1754.054302] env[68964]: ERROR oslo_vmware.rw_handles [ 1754.055052] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1754.057072] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1754.057314] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Copying Virtual Disk [datastore1] vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/d61c993a-2af2-426b-b7de-d15f5d87555f/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1754.057611] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8bff5077-7f7c-4e47-b19c-1ec188f8dc03 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.066123] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Waiting for the task: (returnval){ [ 1754.066123] env[68964]: value = "task-3431750" [ 1754.066123] env[68964]: _type = "Task" [ 1754.066123] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1754.074005] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Task: {'id': task-3431750, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1754.577060] env[68964]: DEBUG oslo_vmware.exceptions [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1754.577310] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1754.577865] env[68964]: ERROR nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1754.577865] env[68964]: Faults: ['InvalidArgument'] [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Traceback (most recent call last): [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] yield resources [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self.driver.spawn(context, instance, image_meta, [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self._fetch_image_if_missing(context, vi) [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] image_cache(vi, tmp_image_ds_loc) [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] vm_util.copy_virtual_disk( [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] session._wait_for_task(vmdk_copy_task) [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] return self.wait_for_task(task_ref) [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] return evt.wait() [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] result = hub.switch() [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] return self.greenlet.switch() [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self.f(*self.args, **self.kw) [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] raise exceptions.translate_fault(task_info.error) [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Faults: ['InvalidArgument'] [ 1754.577865] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] [ 1754.579094] env[68964]: INFO nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Terminating instance [ 1754.579724] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1754.579944] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1754.580193] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ef722d9b-4f05-46b6-816a-579caa558404 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.582838] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1754.583048] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1754.583763] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1210e087-d1d3-4b2f-b001-7ee3e1cc7934 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.590277] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1754.590520] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c8ba711f-0601-4916-940b-d3e2d22b18cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.592618] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1754.592812] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1754.593782] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d343671-7761-4490-b9f6-1c4fe63c51e6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.598250] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Waiting for the task: (returnval){ [ 1754.598250] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]527acee8-2c8b-e7a2-20ef-5787515a534b" [ 1754.598250] env[68964]: _type = "Task" [ 1754.598250] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1754.605168] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]527acee8-2c8b-e7a2-20ef-5787515a534b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1754.655354] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1754.655567] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1754.655704] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Deleting the datastore file [datastore1] 1b41b7f3-3ae4-48ca-aefc-5563060199d5 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1754.655963] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d6ebf4ae-382c-4833-951a-822a8accd080 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1754.661845] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Waiting for the task: (returnval){ [ 1754.661845] env[68964]: value = "task-3431752" [ 1754.661845] env[68964]: _type = "Task" [ 1754.661845] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1754.669569] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Task: {'id': task-3431752, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1755.108194] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1755.108465] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Creating directory with path [datastore1] vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1755.108696] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-02436a1f-4cca-4030-9741-380ac3905713 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.120324] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Created directory with path [datastore1] vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1755.120324] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Fetch image to [datastore1] vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1755.120500] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1755.121230] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a2615ac-c928-4a23-8ada-6eafa4655f90 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.127824] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f1405bb-1ec9-482d-a24d-8f1392c75aca {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.136736] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ff5b504-5831-45c3-ae52-fffdbb6eb3ab {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.171547] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-288faf57-9f37-431c-9e51-6c8aac6e4b8f {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.178197] env[68964]: DEBUG oslo_vmware.api [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Task: {'id': task-3431752, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073526} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1755.179631] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1755.179823] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1755.179991] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1755.180177] env[68964]: INFO nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1755.182173] env[68964]: DEBUG nova.compute.claims [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1755.182340] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1755.182557] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1755.185689] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c5336a07-ba67-4e13-b72a-79a67b1fa48f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.204955] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1755.289815] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1755.349576] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1755.349765] env[68964]: DEBUG oslo_vmware.rw_handles [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1755.448413] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0869f62-3574-47de-a309-e5ef518768c4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.456351] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42633f4d-0742-4c50-ae7a-a3f859f5c3df {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.486501] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d91750c-b0a3-4515-bdc0-529a84b886f8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.493483] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-751fc6e7-298a-47db-9927-fff1d981fa5f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.506196] env[68964]: DEBUG nova.compute.provider_tree [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1755.515682] env[68964]: DEBUG nova.scheduler.client.report [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1755.530878] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.348s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1755.531420] env[68964]: ERROR nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1755.531420] env[68964]: Faults: ['InvalidArgument'] [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Traceback (most recent call last): [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1755.531420] env[68964]: ERROR 
nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self.driver.spawn(context, instance, image_meta, [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self._fetch_image_if_missing(context, vi) [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] image_cache(vi, tmp_image_ds_loc) [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] vm_util.copy_virtual_disk( [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] session._wait_for_task(vmdk_copy_task) [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] return self.wait_for_task(task_ref) [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] return evt.wait() [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] result = hub.switch() [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] return self.greenlet.switch() [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] self.f(*self.args, **self.kw) [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] raise exceptions.translate_fault(task_info.error) [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Faults: ['InvalidArgument'] [ 1755.531420] env[68964]: ERROR nova.compute.manager [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] [ 1755.532365] env[68964]: DEBUG nova.compute.utils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1755.533491] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Build of instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 was re-scheduled: A specified parameter was not correct: fileType [ 1755.533491] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1755.533848] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1755.534031] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1755.534210] env[68964]: DEBUG nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1755.534372] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1755.719764] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1755.724437] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1755.836744] env[68964]: DEBUG nova.network.neutron [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1755.850679] env[68964]: INFO nova.compute.manager [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Took 0.32 seconds to deallocate network for instance. 
[ 1755.944655] env[68964]: INFO nova.scheduler.client.report [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Deleted allocations for instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 [ 1755.963480] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c823e096-c954-431d-9c6b-37a1725cb3a7 tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 619.412s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1755.964616] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 423.238s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1755.964841] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Acquiring lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1755.965071] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1755.965245] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1755.967176] env[68964]: INFO nova.compute.manager [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Terminating instance [ 1755.968798] env[68964]: DEBUG nova.compute.manager [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1755.968995] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1755.969533] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8b6c37c5-d976-42cd-b3e5-5ea7c2611061 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.976357] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1755.982980] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-077067af-16d2-42f5-aa30-84f7d0604c42 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.013696] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1b41b7f3-3ae4-48ca-aefc-5563060199d5 could not be found. [ 1756.013905] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1756.014128] env[68964]: INFO nova.compute.manager [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1756.014448] env[68964]: DEBUG oslo.service.loopingcall [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1756.016666] env[68964]: DEBUG nova.compute.manager [-] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1756.016770] env[68964]: DEBUG nova.network.neutron [-] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1756.030503] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1756.030750] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1756.032301] env[68964]: INFO nova.compute.claims [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1756.042594] env[68964]: DEBUG nova.network.neutron [-] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1756.066853] env[68964]: INFO nova.compute.manager [-] [instance: 1b41b7f3-3ae4-48ca-aefc-5563060199d5] Took 0.05 seconds to deallocate network for instance. 
[ 1756.156438] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b94e880d-b593-427c-9029-9637af777efa tempest-MultipleCreateTestJSON-918373324 tempest-MultipleCreateTestJSON-918373324-project-member] Lock "1b41b7f3-3ae4-48ca-aefc-5563060199d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.192s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1756.238470] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-369bc1d0-4c06-4d62-9b4e-1648c602c735 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.247383] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98ebe97-ef5f-46d6-9b48-a17a07be8336 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.278026] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7addedf9-3574-4162-9c10-86877f043d54 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.285262] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a4d57b-aaf0-4b92-9b8a-2c9497866ea5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.298184] env[68964]: DEBUG nova.compute.provider_tree [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1756.307126] env[68964]: DEBUG nova.scheduler.client.report [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1756.321845] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1756.322335] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1756.361978] env[68964]: DEBUG nova.compute.utils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1756.363193] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1756.363358] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1756.371519] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1756.424039] env[68964]: DEBUG nova.policy [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a850f3e7307468d9e739fda0ce4fdb3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed3fb39ffe124bbaae0b10d818a90c2f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1756.438371] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1756.467615] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1756.467866] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1756.468049] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1756.468219] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1756.468591] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1756.468837] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1756.469095] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1756.469792] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} 
[ 1756.469792] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1756.469792] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1756.469792] env[68964]: DEBUG nova.virt.hardware [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1756.470659] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a13c350e-8ccd-45b8-b643-327be69bcc34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.479513] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55ab0bc3-911b-44db-9a38-d7b8a29e40b3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.724377] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1756.724555] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1756.724680] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1756.744873] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.745107] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.745283] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.745454] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.745619] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.745775] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.745932] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.746111] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.746273] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.746429] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1756.746585] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1756.816923] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Successfully created port: cd704cbc-7947-42ca-babc-729612eb9dea {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1757.430569] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Successfully updated port: cd704cbc-7947-42ca-babc-729612eb9dea {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1757.443793] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "refresh_cache-f94037f2-5dea-4824-9f2d-0f87684ccdb8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1757.443793] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired lock "refresh_cache-f94037f2-5dea-4824-9f2d-0f87684ccdb8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1757.443793] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1757.491207] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1757.724510] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1757.724758] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1757.876629] env[68964]: DEBUG nova.compute.manager [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Received event network-vif-plugged-cd704cbc-7947-42ca-babc-729612eb9dea {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1757.877254] env[68964]: DEBUG oslo_concurrency.lockutils [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] Acquiring lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1757.877608] env[68964]: DEBUG oslo_concurrency.lockutils [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1757.877906] env[68964]: DEBUG oslo_concurrency.lockutils [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1757.878219] env[68964]: DEBUG nova.compute.manager [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] No waiting events found dispatching network-vif-plugged-cd704cbc-7947-42ca-babc-729612eb9dea {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1757.878543] env[68964]: WARNING nova.compute.manager [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Received unexpected event network-vif-plugged-cd704cbc-7947-42ca-babc-729612eb9dea for instance with vm_state building and task_state spawning. [ 1757.878836] env[68964]: DEBUG nova.compute.manager [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Received event network-changed-cd704cbc-7947-42ca-babc-729612eb9dea {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1757.879136] env[68964]: DEBUG nova.compute.manager [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Refreshing instance network info cache due to event network-changed-cd704cbc-7947-42ca-babc-729612eb9dea. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1757.879436] env[68964]: DEBUG oslo_concurrency.lockutils [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] Acquiring lock "refresh_cache-f94037f2-5dea-4824-9f2d-0f87684ccdb8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1757.922678] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Updating instance_info_cache with network_info: [{"id": "cd704cbc-7947-42ca-babc-729612eb9dea", "address": "fa:16:3e:65:50:78", "network": {"id": "8269932d-431e-40c7-a163-4ac9eb02d711", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2129165268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed3fb39ffe124bbaae0b10d818a90c2f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcd704cbc-79", "ovs_interfaceid": "cd704cbc-7947-42ca-babc-729612eb9dea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1757.935061] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Releasing lock "refresh_cache-f94037f2-5dea-4824-9f2d-0f87684ccdb8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1757.935061] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance network_info: |[{"id": "cd704cbc-7947-42ca-babc-729612eb9dea", "address": "fa:16:3e:65:50:78", "network": {"id": "8269932d-431e-40c7-a163-4ac9eb02d711", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2129165268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed3fb39ffe124bbaae0b10d818a90c2f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", 
"segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcd704cbc-79", "ovs_interfaceid": "cd704cbc-7947-42ca-babc-729612eb9dea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1757.935285] env[68964]: DEBUG oslo_concurrency.lockutils [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] Acquired lock "refresh_cache-f94037f2-5dea-4824-9f2d-0f87684ccdb8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1757.935285] env[68964]: DEBUG nova.network.neutron [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Refreshing network info cache for port cd704cbc-7947-42ca-babc-729612eb9dea {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1757.936346] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:65:50:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '99be9a5e-b3f9-4e6c-83d5-df11f817847d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cd704cbc-7947-42ca-babc-729612eb9dea', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1757.944159] env[68964]: DEBUG oslo.service.loopingcall [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1757.945190] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1757.947483] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2380660a-9fcf-472b-bb6d-551a98e0cf23 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.967307] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1757.967307] env[68964]: value = "task-3431753" [ 1757.967307] env[68964]: _type = "Task" [ 1757.967307] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1757.974977] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431753, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1758.199077] env[68964]: DEBUG nova.network.neutron [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Updated VIF entry in instance network info cache for port cd704cbc-7947-42ca-babc-729612eb9dea. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1758.199445] env[68964]: DEBUG nova.network.neutron [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Updating instance_info_cache with network_info: [{"id": "cd704cbc-7947-42ca-babc-729612eb9dea", "address": "fa:16:3e:65:50:78", "network": {"id": "8269932d-431e-40c7-a163-4ac9eb02d711", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2129165268-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ed3fb39ffe124bbaae0b10d818a90c2f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcd704cbc-79", "ovs_interfaceid": "cd704cbc-7947-42ca-babc-729612eb9dea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1758.209023] env[68964]: DEBUG oslo_concurrency.lockutils [req-08cc885f-eeb0-4d94-8a78-399309c7704c req-6e5b84d0-64f9-463c-ad15-8a5fe0fa313f service nova] Releasing lock "refresh_cache-f94037f2-5dea-4824-9f2d-0f87684ccdb8" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1758.476905] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431753, 'name': CreateVM_Task, 'duration_secs': 0.27149} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1758.477156] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1758.477840] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1758.478011] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1758.478344] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1758.478611] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a544ba45-efc0-4098-abde-f0bab8aa748b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.483222] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){ [ 1758.483222] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52f0739d-a81b-1b6e-5d77-6c8418285cd6" [ 1758.483222] env[68964]: _type = "Task" [ 1758.483222] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1758.490939] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52f0739d-a81b-1b6e-5d77-6c8418285cd6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1758.725073] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1758.725769] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1758.737082] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1758.737312] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1758.737480] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1758.737637] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1758.738728] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04009d17-69b5-4b32-ad99-d4f6ae6c784d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.748315] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e4d8ce5-690b-4b32-98f9-02c906b5e352 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.762114] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c360017b-02f0-4161-83cc-bb5027f69994 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.767979] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4445608-5a1b-4e61-add7-7c245d34f1f2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.796259] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1758.796433] env[68964]: 
DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1758.796577] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1758.867113] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 07ea329b-3934-437a-8b44-57045e86c310 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.867287] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.867415] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.867536] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.867659] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.867776] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.867892] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.868013] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.868139] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.868252] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1758.879867] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1758.890065] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1758.890276] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1758.890420] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1758.994685] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1758.994974] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1758.995158] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1759.033729] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c696e1a5-7e71-4ba6-8578-2d2738125199 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.040642] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46636a33-6b2f-4a93-8e10-4dca63f4c089 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.070859] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c579b02d-1032-43f9-a1b3-9657431dd34f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.077509] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4939fda6-0d93-4cb1-8ccf-258997ef741a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.090237] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1759.098595] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] 
Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1759.113412] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1759.113588] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1802.239863] env[68964]: WARNING oslo_vmware.rw_handles [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1802.239863] env[68964]: ERROR oslo_vmware.rw_handles [ 1802.240616] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1802.242433] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Caching image {{(pid=68964) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1802.242686] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Copying Virtual Disk [datastore1] vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/7fe864ae-6571-43d6-9d81-97938570145a/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1802.242979] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5eae5371-2bfa-4996-8b5a-8121e66ce7c5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.250549] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Waiting for the task: (returnval){ [ 1802.250549] env[68964]: value = "task-3431754" [ 1802.250549] env[68964]: _type = "Task" [ 1802.250549] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1802.258375] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Task: {'id': task-3431754, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1802.761522] env[68964]: DEBUG oslo_vmware.exceptions [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1802.761808] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1802.762394] env[68964]: ERROR nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1802.762394] env[68964]: Faults: ['InvalidArgument'] [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] Traceback (most recent call last): [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] yield resources [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self.driver.spawn(context, instance, image_meta, [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self._fetch_image_if_missing(context, vi) [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] image_cache(vi, tmp_image_ds_loc) [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] vm_util.copy_virtual_disk( [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] session._wait_for_task(vmdk_copy_task) [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] return self.wait_for_task(task_ref) [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] return evt.wait() [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] result = hub.switch() [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] return self.greenlet.switch() [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self.f(*self.args, **self.kw) [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] raise exceptions.translate_fault(task_info.error) [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] Faults: ['InvalidArgument'] [ 1802.762394] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] [ 1802.763589] env[68964]: INFO nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Terminating instance [ 1802.764330] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1802.764543] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1802.764805] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5dbacd79-b04b-411e-b4a0-83eb49cf0824 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.767065] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1802.767231] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1802.767943] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6b951be-1e96-471b-8bc3-96f1dbff56f4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.774478] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1802.774688] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d57356ca-f3ef-4e06-88dc-8fa7481be5dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.776768] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1802.776938] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1802.777869] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b9612c0a-c5ff-4d42-8f79-244c6bdf1b39 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.782135] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Waiting for the task: (returnval){ [ 1802.782135] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5243b279-7408-1511-bdd7-1f76d039906c" [ 1802.782135] env[68964]: _type = "Task" [ 1802.782135] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1802.790357] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5243b279-7408-1511-bdd7-1f76d039906c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1802.843106] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1802.843890] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1802.843890] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Deleting the datastore file [datastore1] 07ea329b-3934-437a-8b44-57045e86c310 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1802.843890] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ea3ef278-0249-4a9d-bab5-ff5d996b7004 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1802.849685] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Waiting for the task: (returnval){ [ 1802.849685] env[68964]: value = "task-3431756" [ 1802.849685] env[68964]: _type = "Task" [ 1802.849685] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1802.857269] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Task: {'id': task-3431756, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1803.292644] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1803.292931] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Creating directory with path [datastore1] vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1803.293160] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c83d02cf-95e1-4bda-84c6-334ed7fa0667 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.304115] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Created directory with path [datastore1] vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1803.304306] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Fetch image to [datastore1] vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1803.304481] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1803.305199] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd7dc3ae-8c01-4e73-a4f1-ac6432393150 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.311613] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbbfe9a0-c964-422d-b309-7ca5167e7847 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.320421] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-443899f4-570d-47ab-98a6-035ee40ad567 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.350440] env[68964]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af530ee2-128d-4083-bfba-5d48f26aaed0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.361183] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7e087c97-307a-44d0-9b09-698e2a6b6c23 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.362860] env[68964]: DEBUG oslo_vmware.api [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Task: {'id': task-3431756, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074006} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1803.363108] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1803.363291] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1803.363460] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1803.363630] env[68964]: INFO nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1803.365707] env[68964]: DEBUG nova.compute.claims [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1803.365874] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1803.366093] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1803.385024] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1803.437741] env[68964]: DEBUG oslo_vmware.rw_handles [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1803.495941] env[68964]: DEBUG oslo_vmware.rw_handles [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1803.496131] env[68964]: DEBUG oslo_vmware.rw_handles [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1803.607527] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc311426-ecc0-42fa-b4fe-de18a4fa8008 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.618771] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b86eb065-a9d9-4b5b-88de-653f1dac6d8b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.650852] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f074fe2-0b36-44a2-a315-930b45e57d76 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.658430] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bbb238c-4f1d-45ab-87bf-b41cbc0cfd0d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1803.672392] env[68964]: DEBUG nova.compute.provider_tree [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1803.681016] env[68964]: DEBUG nova.scheduler.client.report [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1803.697028] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.330s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1803.697028] env[68964]: ERROR nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1803.697028] env[68964]: Faults: ['InvalidArgument'] [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] Traceback (most recent call last): [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 
07ea329b-3934-437a-8b44-57045e86c310] self.driver.spawn(context, instance, image_meta, [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self._fetch_image_if_missing(context, vi) [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] image_cache(vi, tmp_image_ds_loc) [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] vm_util.copy_virtual_disk( [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] session._wait_for_task(vmdk_copy_task) [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] return self.wait_for_task(task_ref) [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] return evt.wait() [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] result = hub.switch() [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] return self.greenlet.switch() [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] self.f(*self.args, **self.kw) [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] raise exceptions.translate_fault(task_info.error) [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] Faults: ['InvalidArgument'] [ 1803.697028] env[68964]: ERROR nova.compute.manager [instance: 07ea329b-3934-437a-8b44-57045e86c310] [ 1803.698461] env[68964]: DEBUG nova.compute.utils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1803.699050] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Build of instance 07ea329b-3934-437a-8b44-57045e86c310 was re-scheduled: A specified parameter was not correct: fileType [ 1803.699050] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1803.699464] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1803.699642] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1803.699844] env[68964]: DEBUG nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1803.699974] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1804.086514] env[68964]: DEBUG nova.network.neutron [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1804.096599] env[68964]: INFO nova.compute.manager [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Took 0.40 seconds to deallocate network for instance. [ 1804.195242] env[68964]: INFO nova.scheduler.client.report [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Deleted allocations for instance 07ea329b-3934-437a-8b44-57045e86c310 [ 1804.223283] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d9385b6d-0bc9-4af4-93e7-da9a9bbd4c25 tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "07ea329b-3934-437a-8b44-57045e86c310" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 635.983s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1804.224593] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "07ea329b-3934-437a-8b44-57045e86c310" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 439.681s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1804.224806] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Acquiring lock "07ea329b-3934-437a-8b44-57045e86c310-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1804.225020] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "07ea329b-3934-437a-8b44-57045e86c310-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1804.226310] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "07ea329b-3934-437a-8b44-57045e86c310-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1804.228277] env[68964]: INFO nova.compute.manager [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Terminating instance [ 1804.232623] env[68964]: DEBUG nova.compute.manager [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1804.232824] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1804.233127] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d07efd6-bd4b-4020-aad4-bb2aa77992cc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.243769] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ce20bd6-dd59-4865-9fb4-56f0e7727b58 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.283652] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 07ea329b-3934-437a-8b44-57045e86c310 could not be found. [ 1804.284057] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1804.284373] env[68964]: INFO nova.compute.manager [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1804.284738] env[68964]: DEBUG oslo.service.loopingcall [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1804.285114] env[68964]: DEBUG nova.compute.manager [-] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1804.285950] env[68964]: DEBUG nova.network.neutron [-] [instance: 07ea329b-3934-437a-8b44-57045e86c310] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1804.287903] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1804.324773] env[68964]: DEBUG nova.network.neutron [-] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1804.352983] env[68964]: INFO nova.compute.manager [-] [instance: 07ea329b-3934-437a-8b44-57045e86c310] Took 0.07 seconds to deallocate network for instance. [ 1804.371409] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1804.371702] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1804.373421] env[68964]: INFO nova.compute.claims [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1804.448453] env[68964]: DEBUG oslo_concurrency.lockutils [None req-c899cc78-5272-497f-8f60-35a08cb5a3ad tempest-ServerTagsTestJSON-362419479 tempest-ServerTagsTestJSON-362419479-project-member] Lock "07ea329b-3934-437a-8b44-57045e86c310" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.224s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1804.550075] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae23e118-a712-4686-aeeb-24fc039a05db {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.557480] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b511c3c0-0549-4d95-a473-b491ce5f9156 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.587943] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3c8c2ae-a8e0-420d-a933-a21b9869d060 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.595798] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117fc579-1a68-47dd-b05a-90b976169739 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.610489] env[68964]: DEBUG nova.compute.provider_tree [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1804.620032] env[68964]: DEBUG nova.scheduler.client.report [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1804.634979] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1804.635458] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1804.666162] env[68964]: DEBUG nova.compute.utils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1804.667570] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Allocating IP information in the background. 
{{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1804.667747] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1804.677029] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1804.725961] env[68964]: DEBUG nova.policy [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21b5b62c1d9a4afc8e26b122ce6de51c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '07b4913b8fef4ee3a0d920bc36fefd18', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1804.741551] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1804.766804] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1804.767058] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1804.767218] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1804.767397] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1804.767545] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1804.767691] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1804.767900] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1804.768067] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1804.768234] env[68964]: DEBUG nova.virt.hardware [None 
req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1804.768392] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1804.768562] env[68964]: DEBUG nova.virt.hardware [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1804.769457] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-387ba13e-2ce0-4119-8671-86c646b23d3b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1804.777552] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11cf3640-117e-469b-9c89-439de6cc5832 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.052860] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Successfully created port: 9e4d4d73-39b6-409e-9fe5-14582f8508a4 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1805.872456] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Successfully updated port: 9e4d4d73-39b6-409e-9fe5-14582f8508a4 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1805.886640] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "refresh_cache-eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1805.886811] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "refresh_cache-eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1805.886938] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1805.923204] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 
tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1806.079871] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Updating instance_info_cache with network_info: [{"id": "9e4d4d73-39b6-409e-9fe5-14582f8508a4", "address": "fa:16:3e:ba:74:bd", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9e4d4d73-39", "ovs_interfaceid": "9e4d4d73-39b6-409e-9fe5-14582f8508a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1806.091207] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "refresh_cache-eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1806.091513] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance network_info: |[{"id": "9e4d4d73-39b6-409e-9fe5-14582f8508a4", "address": "fa:16:3e:ba:74:bd", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9e4d4d73-39", "ovs_interfaceid": "9e4d4d73-39b6-409e-9fe5-14582f8508a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1806.091894] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:74:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92fe29b3-0907-453d-aabb-5559c4bd7c0f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9e4d4d73-39b6-409e-9fe5-14582f8508a4', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1806.099332] env[68964]: DEBUG oslo.service.loopingcall [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1806.100026] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1806.100026] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-30ecdeb7-ce9c-40ec-a461-4cab53163112 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.120350] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1806.120350] env[68964]: value = "task-3431757" [ 1806.120350] env[68964]: _type = "Task" [ 1806.120350] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1806.122531] env[68964]: DEBUG nova.compute.manager [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Received event network-vif-plugged-9e4d4d73-39b6-409e-9fe5-14582f8508a4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1806.122734] env[68964]: DEBUG oslo_concurrency.lockutils [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] Acquiring lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.122936] env[68964]: DEBUG oslo_concurrency.lockutils [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.123113] env[68964]: DEBUG oslo_concurrency.lockutils [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.123282] env[68964]: DEBUG nova.compute.manager [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] No waiting events found dispatching network-vif-plugged-9e4d4d73-39b6-409e-9fe5-14582f8508a4 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1806.123445] env[68964]: WARNING nova.compute.manager [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Received unexpected event network-vif-plugged-9e4d4d73-39b6-409e-9fe5-14582f8508a4 for instance with vm_state building and task_state spawning. [ 1806.123638] env[68964]: DEBUG nova.compute.manager [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Received event network-changed-9e4d4d73-39b6-409e-9fe5-14582f8508a4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1806.123798] env[68964]: DEBUG nova.compute.manager [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Refreshing instance network info cache due to event network-changed-9e4d4d73-39b6-409e-9fe5-14582f8508a4. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1806.123976] env[68964]: DEBUG oslo_concurrency.lockutils [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] Acquiring lock "refresh_cache-eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1806.124121] env[68964]: DEBUG oslo_concurrency.lockutils [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] Acquired lock "refresh_cache-eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1806.124274] env[68964]: DEBUG nova.network.neutron [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Refreshing network info cache for port 9e4d4d73-39b6-409e-9fe5-14582f8508a4 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1806.134658] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431757, 'name': CreateVM_Task} progress is 5%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1806.384574] env[68964]: DEBUG nova.network.neutron [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Updated VIF entry in instance network info cache for port 9e4d4d73-39b6-409e-9fe5-14582f8508a4. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1806.384932] env[68964]: DEBUG nova.network.neutron [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Updating instance_info_cache with network_info: [{"id": "9e4d4d73-39b6-409e-9fe5-14582f8508a4", "address": "fa:16:3e:ba:74:bd", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9e4d4d73-39", "ovs_interfaceid": "9e4d4d73-39b6-409e-9fe5-14582f8508a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1806.395104] env[68964]: DEBUG oslo_concurrency.lockutils [req-07b83fe7-d08a-444b-bdb6-2bc937da2bc7 req-7f6a446e-87b5-43ef-9190-da2d61cf1746 service nova] Releasing lock "refresh_cache-eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" {{(pid=68964) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1806.631487] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431757, 'name': CreateVM_Task, 'duration_secs': 0.291407} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1806.631746] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1806.632442] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1806.632609] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1806.632915] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1806.633524] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8b9665d4-1ab7-44f2-a1b7-2837b8fbf152 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.637538] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1806.637538] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52fafaa4-3c7f-76ea-899e-c2f8d12eb60a" [ 1806.637538] env[68964]: _type = "Task" [ 1806.637538] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1806.646704] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52fafaa4-3c7f-76ea-899e-c2f8d12eb60a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1807.148641] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1807.148908] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1807.149128] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1809.724462] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1811.108134] env[68964]: DEBUG oslo_concurrency.lockutils [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1813.731384] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1813.731752] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1813.731847] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances with incomplete migration {{(pid=68964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1815.735743] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1815.735743] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1815.735743] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1816.724891] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1816.725170] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1816.725361] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1816.746697] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.746954] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.746989] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747113] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747238] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747358] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747476] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747593] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747711] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747825] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1816.747942] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1817.724674] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1817.724674] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1817.724839] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1819.724777] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1820.725875] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1820.738726] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1820.739085] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1820.739364] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1820.739641] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1820.741314] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45cca85e-4a48-4af0-bd70-b67dc4166268 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1820.753783] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5cb64aa-572a-4f8f-9356-2549bf5e9a6f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1820.775635] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25a09a9b-534f-4121-978f-3a3cb8b80dc4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1820.785883] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-952f0355-84ff-403f-9c01-f9f587a06d40 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1820.837640] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1820.837945] env[68964]: DEBUG 
oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1820.838298] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1820.973654] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.973822] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 3d41d454-f370-46a6-ba97-17f5553d557c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.973970] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974092] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974216] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974325] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974446] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974560] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974673] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.974784] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1820.987079] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1820.987315] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1820.987559] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1821.117198] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9dd5927-3c15-4a0d-bb84-d4ded2fdcb41 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1821.124537] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-830fe5ae-ae72-488a-8e76-8a2823d0bdbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1821.154099] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98d504d2-a86f-49b0-b4ca-857be6368818 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1821.160886] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6853bc3-499d-48e3-8219-a243a3592b98 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1821.174814] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1821.182757] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1821.198542] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1821.198718] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.361s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1822.193609] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1823.725162] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1823.725478] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1823.734960] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] There are 0 instances to clean {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1851.816059] env[68964]: WARNING oslo_vmware.rw_handles [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in 
getresponse [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1851.816059] env[68964]: ERROR oslo_vmware.rw_handles [ 1851.816729] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1851.818854] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1851.819152] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Copying Virtual Disk [datastore1] vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/5c46883b-1c1f-46fc-b360-269df6b45603/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1851.819447] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-02a808bf-7ff5-4dd3-bd57-53d6b84c53c2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1851.827513] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Waiting for the task: (returnval){ [ 1851.827513] env[68964]: value = "task-3431758" [ 1851.827513] env[68964]: _type = "Task" [ 1851.827513] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1851.835128] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Task: {'id': task-3431758, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.337757] env[68964]: DEBUG oslo_vmware.exceptions [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1852.338054] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1852.338721] env[68964]: ERROR nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1852.338721] env[68964]: Faults: ['InvalidArgument'] [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Traceback (most recent call last): [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] yield resources [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self.driver.spawn(context, instance, image_meta, [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self._fetch_image_if_missing(context, vi) [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] image_cache(vi, tmp_image_ds_loc) [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] vm_util.copy_virtual_disk( [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 
3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] session._wait_for_task(vmdk_copy_task) [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] return self.wait_for_task(task_ref) [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] return evt.wait() [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] result = hub.switch() [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] return self.greenlet.switch() [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self.f(*self.args, **self.kw) [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] raise exceptions.translate_fault(task_info.error) [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Faults: ['InvalidArgument'] [ 1852.338721] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] [ 1852.339855] env[68964]: INFO nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Terminating instance [ 1852.340969] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1852.340969] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1852.340969] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8cff3275-32f2-4986-9339-888e82422025 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.343114] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1852.343306] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1852.344012] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a4f639d-e7e8-4e6f-aea4-ee95b20751ed {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.350540] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1852.350738] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-83510617-a30c-43e5-8231-e7d560a471ec {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.352781] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1852.352952] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1852.353869] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f22578b9-d252-43a9-90cf-ee1a27371395 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.358462] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){ [ 1852.358462] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a04c76-6458-cbd8-a8a7-02c32f851227" [ 1852.358462] env[68964]: _type = "Task" [ 1852.358462] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1852.365276] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a04c76-6458-cbd8-a8a7-02c32f851227, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.425433] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1852.425632] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1852.425807] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Deleting the datastore file [datastore1] 3d41d454-f370-46a6-ba97-17f5553d557c {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1852.426078] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0f7cf4bb-5057-422b-b0a2-cb7934da3886 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.432047] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Waiting for the task: (returnval){ [ 1852.432047] env[68964]: value = "task-3431760" [ 1852.432047] env[68964]: _type = "Task" [ 1852.432047] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1852.439864] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Task: {'id': task-3431760, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1852.780905] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_power_states {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.800604] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Getting list of instances from cluster (obj){ [ 1852.800604] env[68964]: value = "domain-c8" [ 1852.800604] env[68964]: _type = "ClusterComputeResource" [ 1852.800604] env[68964]: } {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1852.801858] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c84c428c-3c49-45e0-942d-fbd8f00f84d6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.820104] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Got total of 9 instances {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1852.820499] env[68964]: WARNING nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] While synchronizing instance power states, found 10 instances in the database and 9 instances on the hypervisor. 
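The WARNING just above records an off-by-one between the cell database and the hypervisor: the _sync_power_states periodic task listed 9 VMs in cluster domain-c8 while the database still held 10 instance rows. The records that follow show the missing instance, 3d41d454-f370-46a6-ba97-17f5553d557c, was mid-teardown, so its per-UUID sync is later skipped with "pending task (deleting)". As a minimal sketch of that comparison (illustrative only, not Nova's implementation; the two short lists stand in for the 10 database rows and the 9 hypervisor VMs):

    # Illustrative sketch -- not Nova's code.
    def unsynced_uuids(db_uuids, hypervisor_uuids):
        """UUIDs the database knows about but the hypervisor no longer has."""
        return sorted(set(db_uuids) - set(hypervisor_uuids))

    db_uuids = ['b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3',
                '3d41d454-f370-46a6-ba97-17f5553d557c']  # ...10 rows in total
    hv_uuids = ['b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3']  # ...9 VMs in total
    print(unsynced_uuids(db_uuids, hv_uuids))
    # ['3d41d454-f370-46a6-ba97-17f5553d557c'] -- the instance being deleted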
[ 1852.820499] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.820631] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 3d41d454-f370-46a6-ba97-17f5553d557c {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.820739] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid dc39aed1-9371-469b-b43e-40ce313c8ab3 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.820890] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 094b1346-f24b-4360-b7c8-46fd2f2c668f {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.821051] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.821205] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 2243b807-c2a0-4917-aae8-5de31dc52e53 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.821355] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 8f94d3c8-4674-463d-8829-68a184967183 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.821504] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid a8d43f08-4cf1-40aa-ad31-2b02b70d6229 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.821650] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid f94037f2-5dea-4824-9f2d-0f87684ccdb8 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.821793] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1852.822113] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.822369] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "3d41d454-f370-46a6-ba97-17f5553d557c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.822582] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.822776] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.822966] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.823185] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "2243b807-c2a0-4917-aae8-5de31dc52e53" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.823402] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "8f94d3c8-4674-463d-8829-68a184967183" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.823601] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.823791] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.823976] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.868192] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1852.868401] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating directory with path [datastore1] 
vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1852.868845] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cb30462d-416d-4ec5-bf27-89455a02907e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.880235] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Created directory with path [datastore1] vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1852.880420] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Fetch image to [datastore1] vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1852.880588] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1852.881340] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69e444ca-28cf-41b9-b39b-43387b3b9f3a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.888088] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81feb5fa-75e6-4eff-89f3-c90cde1f0bbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.896995] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49af4ed6-d376-42a7-800a-1a47a2027920 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.927206] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f68adf8b-fff2-414a-9c98-b5b584c9dea1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.935214] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3c8d5f0b-e856-477f-a7a2-c00a8a7f0b6b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1852.941940] env[68964]: DEBUG oslo_vmware.api [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Task: {'id': task-3431760, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.060928} 
completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1852.942219] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1852.942431] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1852.942608] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1852.942781] env[68964]: INFO nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1852.944867] env[68964]: DEBUG nova.compute.claims [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1852.945039] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1852.945251] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1852.956794] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1853.021330] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating HTTP connection to write to file with size = 21318656 and URL 
= https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1853.086073] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1853.086073] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1853.189139] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa2e0b4-b594-4155-aec1-d5c7b707b1cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.196682] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a47c46bf-c5de-49b5-8faa-00be59dc3473 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.227216] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa32845-7456-441c-9ae3-5d6cb34dfa28 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.234172] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58a5cccb-ef2d-417e-b4fc-77cdce15a925 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.247189] env[68964]: DEBUG nova.compute.provider_tree [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1853.255854] env[68964]: DEBUG nova.scheduler.client.report [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1853.271020] env[68964]: DEBUG oslo_concurrency.lockutils 
[None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.326s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.271544] env[68964]: ERROR nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1853.271544] env[68964]: Faults: ['InvalidArgument'] [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Traceback (most recent call last): [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self.driver.spawn(context, instance, image_meta, [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self._fetch_image_if_missing(context, vi) [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] image_cache(vi, tmp_image_ds_loc) [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] vm_util.copy_virtual_disk( [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] session._wait_for_task(vmdk_copy_task) [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] return self.wait_for_task(task_ref) [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1853.271544] env[68964]: ERROR 
nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] return evt.wait() [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] result = hub.switch() [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] return self.greenlet.switch() [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] self.f(*self.args, **self.kw) [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] raise exceptions.translate_fault(task_info.error) [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Faults: ['InvalidArgument'] [ 1853.271544] env[68964]: ERROR nova.compute.manager [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] [ 1853.272766] env[68964]: DEBUG nova.compute.utils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1853.273758] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Build of instance 3d41d454-f370-46a6-ba97-17f5553d557c was re-scheduled: A specified parameter was not correct: fileType [ 1853.273758] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1853.274136] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1853.274309] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1853.274479] env[68964]: DEBUG nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1853.274662] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1853.579361] env[68964]: DEBUG nova.network.neutron [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1853.592055] env[68964]: INFO nova.compute.manager [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Took 0.32 seconds to deallocate network for instance. [ 1853.690106] env[68964]: INFO nova.scheduler.client.report [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Deleted allocations for instance 3d41d454-f370-46a6-ba97-17f5553d557c [ 1853.713745] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0356894-d71a-4bee-aaef-f3332f999591 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "3d41d454-f370-46a6-ba97-17f5553d557c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 562.969s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.714902] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "3d41d454-f370-46a6-ba97-17f5553d557c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 367.434s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.715143] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Acquiring lock "3d41d454-f370-46a6-ba97-17f5553d557c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.715403] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 
tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "3d41d454-f370-46a6-ba97-17f5553d557c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.715528] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "3d41d454-f370-46a6-ba97-17f5553d557c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1853.718242] env[68964]: INFO nova.compute.manager [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Terminating instance [ 1853.719547] env[68964]: DEBUG nova.compute.manager [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1853.719749] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1853.720487] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0956cd58-4d8d-4b98-b0d5-907f8b3f8a04 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.726305] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1853.732996] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12a7e9a9-d703-455d-895b-036c96e3c3ab {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1853.765075] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3d41d454-f370-46a6-ba97-17f5553d557c could not be found. 
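Both spawn tracebacks in this section terminate in oslo_vmware.api's _poll_task raising the translated vCenter fault ("A specified parameter was not correct: fileType", InvalidArgument). That fault aborts the spawn, the resource claim is rolled back, the build is re-scheduled, and by the time the user-initiated terminate runs here the VM has already been unregistered, hence the InstanceNotFound warning and the effectively no-op destroy. The polling loop visible in those tracebacks has roughly this shape (a self-contained sketch under assumed names; TaskError and fetch_task_info are illustrative stand-ins, not oslo.vmware's API):

    import time

    class TaskError(Exception):
        pass

    def wait_for_task(fetch_task_info, interval=0.5):
        """Poll a long-running task until it settles; raise on failure."""
        while True:
            info = fetch_task_info()          # hypothetical status callable
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # e.g. "A specified parameter was not correct: fileType"
                raise TaskError(info['error'])
            time.sleep(interval)              # queued/running: poll again

    # Usage: wait_for_task(lambda: {'state': 'error', 'error': 'InvalidArgument'})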
[ 1853.765296] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1853.765472] env[68964]: INFO nova.compute.manager [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1853.765720] env[68964]: DEBUG oslo.service.loopingcall [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1853.770052] env[68964]: DEBUG nova.compute.manager [-] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1853.770187] env[68964]: DEBUG nova.network.neutron [-] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1853.782148] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1853.782321] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1853.783736] env[68964]: INFO nova.compute.claims [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1853.799841] env[68964]: DEBUG nova.network.neutron [-] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1853.817027] env[68964]: INFO nova.compute.manager [-] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] Took 0.05 seconds to deallocate network for instance. 
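The tracker and placement figures quoted in this section are internally consistent, which makes a quick sanity check when reading the log: ten allocations of {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} plus the 512 MB 'reserved' in the MEMORY_MB inventory reproduce the reported used_ram=1792MB, used_disk=10GB and used_vcpus=10, and the VCPU allocation_ratio of 4.0 turns the 48 physical vCPUs into 192 schedulable ones. In code, using only numbers that appear in the records above:

    # Consistency check built only from figures quoted in this log.
    instances = 10                         # allocations listed by the tracker
    per = {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}
    reserved_mb = 512                      # 'reserved' in MEMORY_MB inventory

    used_ram_mb = reserved_mb + instances * per['MEMORY_MB']
    used_disk_gb = instances * per['DISK_GB']
    used_vcpus = instances * per['VCPU']
    assert (used_ram_mb, used_disk_gb, used_vcpus) == (1792, 10, 10)

    schedulable_vcpus = 48 * 4.0           # VCPU total * allocation_ratio
    print(used_ram_mb, used_disk_gb, used_vcpus, schedulable_vcpus)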
[ 1853.912132] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7b74e368-4d90-4dac-aa09-5819d4e1f0a8 tempest-ServersNegativeTestMultiTenantJSON-2130509384 tempest-ServersNegativeTestMultiTenantJSON-2130509384-project-member] Lock "3d41d454-f370-46a6-ba97-17f5553d557c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.197s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1853.913680] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "3d41d454-f370-46a6-ba97-17f5553d557c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.091s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1853.913895] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 3d41d454-f370-46a6-ba97-17f5553d557c] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1853.914100] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "3d41d454-f370-46a6-ba97-17f5553d557c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1853.959125] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c14bf61-5684-4aab-bf9b-e0d5add70ff9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1853.967032] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1871b315-16a7-44f3-b4c1-030445ab380f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1853.996507] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1759184-3518-437b-9d31-7e515e3f976d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1854.003044] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb04a1e-3ac7-4905-bfae-c5a78cc51dc1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1854.015586] env[68964]: DEBUG nova.compute.provider_tree [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1854.024031] env[68964]: DEBUG nova.scheduler.client.report [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1854.037239] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1854.037693] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1854.068724] env[68964]: DEBUG nova.compute.utils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1854.071415] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1854.071602] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1854.079593] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1854.127405] env[68964]: DEBUG nova.policy [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b6a0df2eb5c4396bc7be80f7d26572d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f8ede73f71584194a54c0248b920dd21', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1854.144682] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1854.172089] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1854.172451] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1854.172669] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1854.172942] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1854.173172] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1854.173384] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1854.173671] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1854.173895] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1854.174142] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1854.174373] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1854.174619] env[68964]: DEBUG nova.virt.hardware [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1854.175795] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9e87566-8f32-4e79-a533-72d232bbaee2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1854.186621] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f273a06e-72ea-42c6-90b9-31af2049e46f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1854.448627] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Successfully created port: 1dca3b19-a329-4aed-af79-f74236221d49 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1855.074639] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Successfully updated port: 1dca3b19-a329-4aed-af79-f74236221d49 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1855.086549] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "refresh_cache-a257b05d-fa9a-4d1a-9086-d571e45a5283" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1855.086694] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquired lock "refresh_cache-a257b05d-fa9a-4d1a-9086-d571e45a5283" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1855.087151] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1855.125994] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1855.283841] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Updating instance_info_cache with network_info: [{"id": "1dca3b19-a329-4aed-af79-f74236221d49", "address": "fa:16:3e:38:ef:2b", "network": {"id": "25831504-98f4-4d18-a313-291ca4790b92", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-101665449-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f8ede73f71584194a54c0248b920dd21", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4734e5e-2a76-4bda-8905-70c9bf9e007f", "external-id": "nsx-vlan-transportzone-122", "segmentation_id": 122, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dca3b19-a3", "ovs_interfaceid": "1dca3b19-a329-4aed-af79-f74236221d49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1855.296087] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Releasing lock "refresh_cache-a257b05d-fa9a-4d1a-9086-d571e45a5283" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1855.296396] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance network_info: |[{"id": "1dca3b19-a329-4aed-af79-f74236221d49", "address": "fa:16:3e:38:ef:2b", "network": {"id": "25831504-98f4-4d18-a313-291ca4790b92", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-101665449-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f8ede73f71584194a54c0248b920dd21", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4734e5e-2a76-4bda-8905-70c9bf9e007f", "external-id": "nsx-vlan-transportzone-122", "segmentation_id": 122, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dca3b19-a3", "ovs_interfaceid": "1dca3b19-a329-4aed-af79-f74236221d49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1855.296786] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:ef:2b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b4734e5e-2a76-4bda-8905-70c9bf9e007f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1dca3b19-a329-4aed-af79-f74236221d49', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1855.304176] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Creating folder: Project (f8ede73f71584194a54c0248b920dd21). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1855.304690] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2e7faa88-af1e-4e0b-95df-e81e032d55de {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1855.314931] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Created folder: Project (f8ede73f71584194a54c0248b920dd21) in parent group-v684465.
[ 1855.315210] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Creating folder: Instances. Parent ref: group-v684600. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1855.315366] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8f58cb0d-99df-4452-b9e9-b231500a8a53 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1855.324609] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Created folder: Instances in parent group-v684600.
[ 1855.324832] env[68964]: DEBUG oslo.service.loopingcall [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1855.325025] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1855.325222] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-269b21eb-e4d0-4add-85eb-a6b56b9bbbd0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1855.345016] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1855.345016] env[68964]: value = "task-3431763"
[ 1855.345016] env[68964]: _type = "Task"
[ 1855.345016] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1855.355127] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431763, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1855.619748] env[68964]: DEBUG nova.compute.manager [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Received event network-vif-plugged-1dca3b19-a329-4aed-af79-f74236221d49 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1855.619979] env[68964]: DEBUG oslo_concurrency.lockutils [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] Acquiring lock "a257b05d-fa9a-4d1a-9086-d571e45a5283-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1855.620222] env[68964]: DEBUG oslo_concurrency.lockutils [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1855.620485] env[68964]: DEBUG oslo_concurrency.lockutils [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1855.620661] env[68964]: DEBUG nova.compute.manager [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] No waiting events found dispatching network-vif-plugged-1dca3b19-a329-4aed-af79-f74236221d49 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1855.620822] env[68964]: WARNING nova.compute.manager [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Received unexpected event network-vif-plugged-1dca3b19-a329-4aed-af79-f74236221d49 for instance with vm_state building and task_state spawning.
[ 1855.620978] env[68964]: DEBUG nova.compute.manager [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Received event network-changed-1dca3b19-a329-4aed-af79-f74236221d49 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1855.621151] env[68964]: DEBUG nova.compute.manager [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Refreshing instance network info cache due to event network-changed-1dca3b19-a329-4aed-af79-f74236221d49. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1855.621336] env[68964]: DEBUG oslo_concurrency.lockutils [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] Acquiring lock "refresh_cache-a257b05d-fa9a-4d1a-9086-d571e45a5283" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1855.621487] env[68964]: DEBUG oslo_concurrency.lockutils [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] Acquired lock "refresh_cache-a257b05d-fa9a-4d1a-9086-d571e45a5283" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1855.621648] env[68964]: DEBUG nova.network.neutron [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Refreshing network info cache for port 1dca3b19-a329-4aed-af79-f74236221d49 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1855.853131] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431763, 'name': CreateVM_Task, 'duration_secs': 0.273837} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1855.853263] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1855.853981] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1855.854129] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1855.854462] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1855.854706] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6637a17-3811-4996-947e-5821fb7c8176 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1855.859014] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Waiting for the task: (returnval){
[ 1855.859014] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ec96a2-5349-1d70-4504-8c26dd6f9460"
[ 1855.859014] env[68964]: _type = "Task"
[ 1855.859014] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1855.866433] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ec96a2-5349-1d70-4504-8c26dd6f9460, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1855.892358] env[68964]: DEBUG nova.network.neutron [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Updated VIF entry in instance network info cache for port 1dca3b19-a329-4aed-af79-f74236221d49. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1855.892717] env[68964]: DEBUG nova.network.neutron [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Updating instance_info_cache with network_info: [{"id": "1dca3b19-a329-4aed-af79-f74236221d49", "address": "fa:16:3e:38:ef:2b", "network": {"id": "25831504-98f4-4d18-a313-291ca4790b92", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-101665449-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f8ede73f71584194a54c0248b920dd21", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b4734e5e-2a76-4bda-8905-70c9bf9e007f", "external-id": "nsx-vlan-transportzone-122", "segmentation_id": 122, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dca3b19-a3", "ovs_interfaceid": "1dca3b19-a329-4aed-af79-f74236221d49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1855.901723] env[68964]: DEBUG oslo_concurrency.lockutils [req-ad800d05-5b76-4635-99fd-f8806ffc9189 req-3d3595e8-e5b4-4726-aa36-7bf82bffcdfc service nova] Releasing lock "refresh_cache-a257b05d-fa9a-4d1a-9086-d571e45a5283" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1856.370476] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1856.370857] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1856.370950] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1862.815115] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1862.815410] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1871.564022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "cc90a5a6-19e6-4674-ad06-2c840927409d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1871.564022] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1875.768051] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1876.725037] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1876.725037] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1876.725037] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1876.747201] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.747376] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.747480] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.747606] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.747732] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.747851] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.747967] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.748235] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.748383] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.748504] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1876.748623] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1876.749110] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1876.749298] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1877.724542] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1879.719660] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1879.724333] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1879.724543] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1881.724660] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1882.724183] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1882.735826] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1882.736110] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1882.736230] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1882.736385] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1882.737509] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92469b2b-4299-4f62-9446-729476d9fa21 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1882.746157] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52b76ab3-f2fc-4e54-a302-4e969328fb03 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1882.759763] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a486d1-c156-46a6-b64f-3b52750c8462 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1882.765947] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-211a757b-ad20-4050-a142-d08352f9ddc3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1882.796327] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180920MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1882.796327] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1882.796483] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1882.866571] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.866748] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.866878] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867044] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867141] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867261] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867377] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867493] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867608] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.867722] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1882.879073] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1882.889136] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1882.889361] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1882.889508] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1882.904970] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Refreshing inventories for resource provider 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 1882.918164] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Updating ProviderTree inventory for provider 63b0294e-f555-48a6-a542-3466427066a9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 1882.918365] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 1882.928374] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Refreshing aggregate associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, aggregates: None {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 1882.946511] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Refreshing trait associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 1883.080497] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-755918bb-1bb6-4dbe-a810-3a8619dcdf5c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1883.087883] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ed76c83-c4c6-4cbc-8934-e7079cde97db {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1883.116720] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e38c40e-1123-4fb4-bb04-1f2b544d48a7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1883.123334] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f04b0a4-c887-4be5-8e78-ec49804a3965 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1883.136604] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1883.145733] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1883.158530] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1883.158700] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.362s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1886.152435] env[68964]: DEBUG oslo_concurrency.lockutils [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1896.168525] env[68964]: DEBUG oslo_concurrency.lockutils [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "a257b05d-fa9a-4d1a-9086-d571e45a5283" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1902.837072] env[68964]: WARNING oslo_vmware.rw_handles [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1902.837072] env[68964]: ERROR oslo_vmware.rw_handles
[ 1902.837841] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1902.839635] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1902.839879] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Copying Virtual Disk [datastore1] vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/5740fece-9562-42ea-91fb-81b603d19e5e/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1902.840179] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5f387dfd-c8b3-4f1e-80ca-04e677d7dfd0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1902.847983] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){
[ 1902.847983] env[68964]: value = "task-3431764"
[ 1902.847983] env[68964]: _type = "Task"
[ 1902.847983] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1902.855772] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': task-3431764, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1903.357764] env[68964]: DEBUG oslo_vmware.exceptions [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1903.358076] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1903.358661] env[68964]: ERROR nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1903.358661] env[68964]: Faults: ['InvalidArgument']
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Traceback (most recent call last):
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     yield resources
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self.driver.spawn(context, instance, image_meta,
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self._fetch_image_if_missing(context, vi)
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     image_cache(vi, tmp_image_ds_loc)
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     vm_util.copy_virtual_disk(
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     session._wait_for_task(vmdk_copy_task)
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     return self.wait_for_task(task_ref)
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     return evt.wait()
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     result = hub.switch()
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     return self.greenlet.switch()
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self.f(*self.args, **self.kw)
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     raise exceptions.translate_fault(task_info.error)
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Faults: ['InvalidArgument']
[ 1903.358661] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]
[ 1903.360232] env[68964]: INFO nova.compute.manager
[None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Terminating instance [ 1903.360534] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1903.360736] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1903.360979] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d7e4d532-b923-466f-8f01-af9b96a35d53 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.363362] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1903.363557] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1903.364293] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9502fb88-d567-4dae-8099-67b845ebbda8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.370618] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1903.370822] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5441d345-3d8a-4edd-8870-c31888e3017b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.373194] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1903.373387] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1903.374044] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c59be6b0-723e-4934-8bf5-9ff4ca464d6e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.378676] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1903.378676] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52af3c3d-09f5-707d-c132-580aa3da6945" [ 1903.378676] env[68964]: _type = "Task" [ 1903.378676] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1903.385651] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52af3c3d-09f5-707d-c132-580aa3da6945, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.434217] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1903.434454] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1903.434609] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Deleting the datastore file [datastore1] b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1903.434887] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-21aea465-dc4e-42db-a1d6-f215194424bb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.440366] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){ [ 1903.440366] env[68964]: value = "task-3431766" [ 1903.440366] env[68964]: _type = "Task" [ 1903.440366] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1903.447713] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': task-3431766, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.889736] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1903.890071] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating directory with path [datastore1] vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1903.890191] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c2e01d1-ec50-4b44-8940-2c5021743d6d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.901518] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created directory with path [datastore1] vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1903.901518] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Fetch image to [datastore1] vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1903.901683] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1903.902380] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eea1861-a39d-4c1b-8324-352b1b6e5567 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.908701] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2db5eed6-9431-4e5a-82d2-2d68d12745fa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.917405] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e239f650-ad0d-46dd-9252-df2a1eefcabf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.956460] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1674e39-831c-4122-bc49-a90d66dc0b68 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.963689] env[68964]: DEBUG oslo_vmware.api [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': task-3431766, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073646} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1903.965136] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1903.965334] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1903.965509] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1903.965709] env[68964]: INFO nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Took 0.60 seconds to destroy the instance on the hypervisor. 
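The records above show oslo.vmware's task pattern end to end: a task-returning SOAP method (here FileManager.DeleteDatastoreFile_Task) is invoked, and wait_for_task polls the returned task reference, logging "progress is N%" until the task completes or its fault is translated and raised (the VimFaultException with Faults: ['InvalidArgument'] in the traceback above). A minimal sketch of that pattern against oslo.vmware's public session API; the endpoint, credentials, and datastore path below are placeholders, not values from this log:

    from oslo_vmware import api, exceptions

    # Placeholder endpoint/credentials; task_poll_interval sets the cadence
    # of the _poll_task "progress is N%" lines.
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)

    # Invoke a task-returning method; the call returns a task reference
    # immediately, as with the DeleteDatastoreFile_Task invocation above.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name='[datastore1] placeholder/path.vmdk',
                              datacenter=None)

    try:
        # Blocks while polling the task; raises the translated fault on
        # task failure.
        session.wait_for_task(task)
    except exceptions.VimFaultException as exc:
        print(exc.fault_list)  # e.g. ['InvalidArgument']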
[ 1903.967458] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-28e9bb3b-2488-4a81-b502-44640163c860 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1903.969303] env[68964]: DEBUG nova.compute.claims [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1903.969476] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1903.969684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1903.991072] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1904.040970] env[68964]: DEBUG oslo_vmware.rw_handles [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1904.101905] env[68964]: DEBUG oslo_vmware.rw_handles [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1904.102367] env[68964]: DEBUG oslo_vmware.rw_handles [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1904.211383] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b85512-c934-40f5-a648-a9472c959c2c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1904.219163] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f3819d-9b8d-42c2-a067-8a78cc9c1b42 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1904.249684] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a36a0b9b-aea8-44c8-a781-49addcd34220 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1904.256730] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d76471c5-1e10-4bf7-b4bf-d0f3d478159b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1904.269530] env[68964]: DEBUG nova.compute.provider_tree [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1904.278246] env[68964]: DEBUG nova.scheduler.client.report [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1904.295617] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.326s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1904.296230] env[68964]: ERROR nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1904.296230] env[68964]: Faults: ['InvalidArgument']
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Traceback (most recent call last):
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self.driver.spawn(context, instance, image_meta,
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self._fetch_image_if_missing(context, vi)
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     image_cache(vi, tmp_image_ds_loc)
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     vm_util.copy_virtual_disk(
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     session._wait_for_task(vmdk_copy_task)
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     return self.wait_for_task(task_ref)
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     return evt.wait()
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     result = hub.switch()
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     return self.greenlet.switch()
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     self.f(*self.args, **self.kw)
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]     raise exceptions.translate_fault(task_info.error)
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Faults: ['InvalidArgument']
[ 1904.296230] env[68964]: ERROR nova.compute.manager [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3]
[ 1904.297163] env[68964]: DEBUG nova.compute.utils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1904.298653] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Build of instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 was re-scheduled: A specified parameter was not correct: fileType
[ 1904.298653] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1904.299021] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1904.299195] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1904.299364] env[68964]: DEBUG nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1904.299525] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1904.681188] env[68964]: DEBUG nova.network.neutron [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1904.697630] env[68964]: INFO nova.compute.manager [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Took 0.40 seconds to deallocate network for instance.
[ 1904.798111] env[68964]: INFO nova.scheduler.client.report [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Deleted allocations for instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3
[ 1904.820026] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5244b0c7-f204-49cf-85dd-88e14d9c313d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 684.231s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1904.820430] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 487.984s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1904.820654] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1904.820860] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1904.821045] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1904.826122] env[68964]: INFO nova.compute.manager [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Terminating instance
[ 1904.827956] env[68964]: DEBUG nova.compute.manager [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1904.828146] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1904.828616] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5b4bfb98-dfa0-4312-9538-411068824bf2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1904.838212] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f97aae4-452e-4389-bf21-2e30933a34cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1904.851348] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1904.872401] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3 could not be found.
[ 1904.872576] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1904.872754] env[68964]: INFO nova.compute.manager [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1904.872992] env[68964]: DEBUG oslo.service.loopingcall [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1904.873222] env[68964]: DEBUG nova.compute.manager [-] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1904.873331] env[68964]: DEBUG nova.network.neutron [-] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1904.898848] env[68964]: DEBUG nova.network.neutron [-] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1904.903563] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1904.903792] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1904.906564] env[68964]: INFO nova.compute.claims [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1904.909201] env[68964]: INFO nova.compute.manager [-] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] Took 0.04 seconds to deallocate network for instance.
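The "Waiting for function ... to return." line above comes from oslo.service's loopingcall module, which Nova uses here to retry network deallocation until it succeeds. A minimal sketch of that module's basic looping pattern (FixedIntervalLoopingCall; the retried function and its stop condition are illustrative and not necessarily the exact helper Nova wraps this call in):

    from oslo_service import loopingcall

    attempts = {'count': 0}

    def _deallocate_with_retries():
        # Pretend the first two calls hit a retryable failure.
        attempts['count'] += 1
        if attempts['count'] < 3:
            return  # keep looping at the fixed interval
        raise loopingcall.LoopingCallDone(retvalue='deallocated')

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    result = timer.start(interval=0.1).wait()  # blocks until LoopingCallDone
    print(result)  # 'deallocated'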
[ 1904.994059] env[68964]: DEBUG oslo_concurrency.lockutils [None req-d8e7f7b7-3b1e-4a76-ad59-8d770c13ad74 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.174s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1904.995043] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 52.173s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1904.995189] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1904.995371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "b8af702e-00e6-4a6d-90c8-d49e4b8bd1d3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1905.082244] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7af0f52e-2672-4b3e-a7cd-7465800e8b4c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1905.090438] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb7d6ada-cdd8-4833-8831-cbf3f038b44d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1905.119852] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d32fa9b9-e237-4586-ad6e-98b516cfc8dd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1905.126541] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a9cee4-304a-404e-a861-7a24a4e8c36e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1905.139825] env[68964]: DEBUG nova.compute.provider_tree [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1905.148576] env[68964]: DEBUG nova.scheduler.client.report [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1905.166723] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1905.167319] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1905.199302] env[68964]: DEBUG nova.compute.utils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1905.200727] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1905.201062] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1905.209154] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1905.260235] env[68964]: DEBUG nova.policy [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae5a60881ac14c52b769561e6f81d6ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6087614d846942ddbd06308568d3f1d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1905.269152] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1905.293764] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1905.294011] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1905.294175] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1905.294389] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1905.294566] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1905.294718] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1905.294937] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1905.295107] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1905.295280] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1905.295445] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1905.295615] env[68964]: DEBUG nova.virt.hardware [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1905.296506] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93ecf81c-630e-4886-bec2-f7b30ea68f22 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1905.304311] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9281508e-0367-45e5-9109-350098f56062 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1905.604265] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Successfully created port: b2c6b949-580a-40b5-a27f-39586d5866c4 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1906.263232] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Successfully updated port: b2c6b949-580a-40b5-a27f-39586d5866c4 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1906.275157] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "refresh_cache-bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1906.275309] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "refresh_cache-bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1906.275461] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1906.315669] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1906.474035] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Updating instance_info_cache with network_info: [{"id": "b2c6b949-580a-40b5-a27f-39586d5866c4", "address": "fa:16:3e:6c:9c:2a", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2c6b949-58", "ovs_interfaceid": "b2c6b949-580a-40b5-a27f-39586d5866c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1906.486388] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "refresh_cache-bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1906.486652] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance network_info: |[{"id": "b2c6b949-580a-40b5-a27f-39586d5866c4", "address": "fa:16:3e:6c:9c:2a", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2c6b949-58", "ovs_interfaceid": "b2c6b949-580a-40b5-a27f-39586d5866c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1906.488210] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:9c:2a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ec763be6-4041-4651-8fd7-3820cf0ab86d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b2c6b949-580a-40b5-a27f-39586d5866c4', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1906.494737] env[68964]: DEBUG oslo.service.loopingcall [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1906.495201] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1906.495428] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-90e73ff5-7ee8-4cc3-83f1-ed82dc1ab729 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1906.515451] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1906.515451] env[68964]: value = "task-3431767"
[ 1906.515451] env[68964]: _type = "Task"
[ 1906.515451] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1906.523117] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431767, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1906.723639] env[68964]: DEBUG nova.compute.manager [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Received event network-vif-plugged-b2c6b949-580a-40b5-a27f-39586d5866c4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1906.723861] env[68964]: DEBUG oslo_concurrency.lockutils [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] Acquiring lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1906.724077] env[68964]: DEBUG oslo_concurrency.lockutils [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1906.724251] env[68964]: DEBUG oslo_concurrency.lockutils [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1906.724435] env[68964]: DEBUG nova.compute.manager [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] No waiting events found dispatching network-vif-plugged-b2c6b949-580a-40b5-a27f-39586d5866c4 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1906.724659] env[68964]: WARNING nova.compute.manager [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Received unexpected event network-vif-plugged-b2c6b949-580a-40b5-a27f-39586d5866c4 for instance with vm_state building and task_state spawning.
[ 1906.724832] env[68964]: DEBUG nova.compute.manager [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Received event network-changed-b2c6b949-580a-40b5-a27f-39586d5866c4 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1906.724984] env[68964]: DEBUG nova.compute.manager [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Refreshing instance network info cache due to event network-changed-b2c6b949-580a-40b5-a27f-39586d5866c4. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1906.725260] env[68964]: DEBUG oslo_concurrency.lockutils [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] Acquiring lock "refresh_cache-bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1906.725390] env[68964]: DEBUG oslo_concurrency.lockutils [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] Acquired lock "refresh_cache-bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1906.725545] env[68964]: DEBUG nova.network.neutron [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Refreshing network info cache for port b2c6b949-580a-40b5-a27f-39586d5866c4 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1906.964987] env[68964]: DEBUG nova.network.neutron [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Updated VIF entry in instance network info cache for port b2c6b949-580a-40b5-a27f-39586d5866c4. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1906.965364] env[68964]: DEBUG nova.network.neutron [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Updating instance_info_cache with network_info: [{"id": "b2c6b949-580a-40b5-a27f-39586d5866c4", "address": "fa:16:3e:6c:9c:2a", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2c6b949-58", "ovs_interfaceid": "b2c6b949-580a-40b5-a27f-39586d5866c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1906.975219] env[68964]: DEBUG oslo_concurrency.lockutils [req-ca95bbf7-b55c-4eef-ad29-f7c1d17832ec req-470ae0d1-8f31-4247-87bc-35f1f861cecd service nova] Releasing lock "refresh_cache-bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1907.025636] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431767, 'name': CreateVM_Task, 'duration_secs': 0.2756} completed successfully.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1907.025823] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1907.036174] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1907.036351] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1907.036681] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1907.036936] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ba40cba6-7338-414b-963b-8e7aa3b59cc8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1907.042094] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 1907.042094] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525fb397-b91d-d8ee-261c-8d83b368b3bf" [ 1907.042094] env[68964]: _type = "Task" [ 1907.042094] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1907.049738] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525fb397-b91d-d8ee-261c-8d83b368b3bf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1907.552874] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1907.553244] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1907.553451] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1936.159856] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1936.725396] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1936.725590] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1936.725732] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1936.748175] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.748385] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.748520] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.748650] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.748775] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.748898] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.749028] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.749156] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.749272] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.749387] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1936.749506] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1937.724621] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1938.725937] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1939.725816] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1939.726039] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1939.726410] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1940.720442] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1941.719369] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1943.724533] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1944.724829] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1944.737234] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1944.737465] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1944.737625] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1944.737779] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1944.738912] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57fbac2d-b160-4f10-8eb9-c623a68b8db5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.747982] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cee33b0e-decc-40c9-99a2-203847c8759e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.763043] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-994ae844-6a7a-4ef2-8967-9dc15cccc310 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.769395] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e54b940-cbee-4d22-891a-bca630539636 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.798941] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180900MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1944.799090] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1944.799271] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1944.869097] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869261] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869389] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869509] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869627] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869743] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869857] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.869969] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.870103] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.870219] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1944.880577] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1944.880781] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1944.880924] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1945.001455] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dac032f-083a-49cf-abdf-ae2e198297cd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.008956] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c35ef8cf-7383-4a60-9c24-8c3f355f9e9c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.038785] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a57ec31d-c59d-473c-a586-0346858e2bee {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.045782] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-376ce251-bbce-43b1-af0f-7621bd642534 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.058862] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1945.067204] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1945.080253] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1945.080455] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.281s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1952.296960] env[68964]: WARNING oslo_vmware.rw_handles [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1952.296960] env[68964]: ERROR oslo_vmware.rw_handles [ 1952.297675] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1952.299518] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1952.299767] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Copying Virtual Disk [datastore1] vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/9af977d5-4470-4d6d-8aac-2702abd3b542/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1952.300067] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-b26a5f23-8024-4d9c-b69b-774dba62a3c5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.308738] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1952.308738] env[68964]: value = "task-3431768" [ 1952.308738] env[68964]: _type = "Task" [ 1952.308738] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1952.316232] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431768, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1952.820758] env[68964]: DEBUG oslo_vmware.exceptions [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1952.821075] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1952.822260] env[68964]: ERROR nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1952.822260] env[68964]: Faults: ['InvalidArgument'] [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Traceback (most recent call last): [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] yield resources [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self.driver.spawn(context, instance, image_meta, [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1952.822260] env[68964]: ERROR nova.compute.manager 
[instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self._fetch_image_if_missing(context, vi) [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] image_cache(vi, tmp_image_ds_loc) [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] vm_util.copy_virtual_disk( [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] session._wait_for_task(vmdk_copy_task) [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] return self.wait_for_task(task_ref) [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] return evt.wait() [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] result = hub.switch() [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] return self.greenlet.switch() [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self.f(*self.args, **self.kw) [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] raise exceptions.translate_fault(task_info.error) [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Faults: ['InvalidArgument'] [ 1952.822260] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] [ 1952.822260] env[68964]: INFO 
nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Terminating instance [ 1952.823577] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1952.823799] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1952.824045] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ee889c3-e0c2-475b-85e7-4a40d9af3c91 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.826344] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1952.826540] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1952.827287] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f477781a-0b19-484f-8ec9-05a5393a95c2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.833899] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1952.834124] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-91ee0e92-593d-47db-bae1-ab8d802c2a5c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.836303] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1952.836475] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1952.837413] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87087594-ca7d-41f4-854b-af1f6e59620d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.841838] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Waiting for the task: (returnval){ [ 1952.841838] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52da7b4f-adf7-c356-f1cf-1462f6dd38ac" [ 1952.841838] env[68964]: _type = "Task" [ 1952.841838] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1952.848746] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52da7b4f-adf7-c356-f1cf-1462f6dd38ac, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1952.902070] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1952.902320] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1952.902504] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleting the datastore file [datastore1] dc39aed1-9371-469b-b43e-40ce313c8ab3 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1952.902775] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3ce6b528-76dc-4163-8329-1aeeb0801c11 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.908642] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 1952.908642] env[68964]: value = "task-3431770" [ 1952.908642] env[68964]: _type = "Task" [ 1952.908642] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1952.916560] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431770, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1953.352165] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1953.352442] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Creating directory with path [datastore1] vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1953.352677] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2adfdc01-32b9-4906-aa4d-403b1118db6c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.379905] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Created directory with path [datastore1] vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1953.380123] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Fetch image to [datastore1] vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1953.380301] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1953.381452] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce94b939-afce-43a7-b13f-196c740719f7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.388015] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b10f5ca5-2fef-4420-adf8-e0b3cf9c3a56 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.396771] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11c5cee0-b52b-43b1-b3a9-f0830cf1ed5f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.429398] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-07845f91-a2fa-4b7e-8cec-20be56ae568f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.436900] env[68964]: DEBUG oslo_vmware.api [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431770, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077532} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1953.438247] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1953.438436] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1953.438649] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1953.438850] env[68964]: INFO nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Took 0.61 seconds to destroy the instance on the hypervisor. 
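
The destroy sequence above follows oslo.vmware's invoke-and-poll pattern: a *_Task method is invoked through the session (UnregisterVM, then FileManager.DeleteDatastoreFile_Task) and wait_for_task() blocks while _poll_task logs progress, as in the task-3431770 entries. A minimal sketch of that pattern, assuming placeholder credentials and a datacenter reference obtained elsewhere; VMwareAPISession, invoke_api and wait_for_task are the real oslo.vmware calls, all concrete values here are illustrative:

    from oslo_vmware import api

    def delete_instance_dir(session, dc_ref, ds_path):
        # Invoke-and-poll, as in the DeleteDatastoreFile_Task entries above:
        # start the task on the FileManager, then block until vCenter
        # reports completion (progress is polled and logged meanwhile).
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(
            session.vim, 'DeleteDatastoreFile_Task', file_manager,
            name=ds_path,        # e.g. '[datastore1] <instance-uuid>' (illustrative)
            datacenter=dc_ref)   # datacenter managed-object ref, assumed given
        session.wait_for_task(task)  # raises a translated fault if the task errors

    # Session construction mirrors the VMwareAPISession._create_session
    # entries earlier in the log; host and credentials are placeholders.
    # session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
    #                                api_retry_count=10, task_poll_interval=0.5)
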
[ 1953.440568] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-463b7cc2-180b-4151-8f2a-db488fde1750 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.442420] env[68964]: DEBUG nova.compute.claims [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1953.442590] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1953.442797] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1953.463233] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1954.227551] env[68964]: DEBUG oslo_vmware.rw_handles [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1954.286301] env[68964]: DEBUG oslo_vmware.rw_handles [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1954.286471] env[68964]: DEBUG oslo_vmware.rw_handles [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1954.357698] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbc49ba6-85ef-493b-96f3-8a3d323a3e05 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.365146] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bd12bc3-a0ed-4130-afd0-d38c12a341e6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.397066] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7853a5ca-776f-4b48-a442-f742348845ce {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.404771] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a931a40-3029-442e-a0ef-6c9c7afed570 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.418588] env[68964]: DEBUG nova.compute.provider_tree [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1954.428363] env[68964]: DEBUG nova.scheduler.client.report [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1954.445177] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 1.002s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1954.445793] env[68964]: ERROR nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1954.445793] env[68964]: Faults: ['InvalidArgument'] [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Traceback (most recent call last): [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1954.445793] env[68964]: ERROR 
nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self.driver.spawn(context, instance, image_meta, [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self._fetch_image_if_missing(context, vi) [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] image_cache(vi, tmp_image_ds_loc) [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] vm_util.copy_virtual_disk( [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] session._wait_for_task(vmdk_copy_task) [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] return self.wait_for_task(task_ref) [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] return evt.wait() [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] result = hub.switch() [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] return self.greenlet.switch() [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] self.f(*self.args, **self.kw) [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] raise exceptions.translate_fault(task_info.error) [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Faults: ['InvalidArgument'] [ 1954.445793] env[68964]: ERROR nova.compute.manager [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] [ 1954.446825] env[68964]: DEBUG nova.compute.utils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1954.448093] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Build of instance dc39aed1-9371-469b-b43e-40ce313c8ab3 was re-scheduled: A specified parameter was not correct: fileType [ 1954.448093] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1954.448455] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1954.448626] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1954.448790] env[68964]: DEBUG nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1954.448950] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1954.766446] env[68964]: DEBUG nova.network.neutron [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1954.776923] env[68964]: INFO nova.compute.manager [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Took 0.33 seconds to deallocate network for instance. [ 1954.867727] env[68964]: INFO nova.scheduler.client.report [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleted allocations for instance dc39aed1-9371-469b-b43e-40ce313c8ab3 [ 1954.890310] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acdfa437-6cc2-44d3-a662-bc59bc917bba tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 658.515s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1954.891582] env[68964]: DEBUG oslo_concurrency.lockutils [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 462.752s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1954.892251] env[68964]: DEBUG oslo_concurrency.lockutils [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "dc39aed1-9371-469b-b43e-40ce313c8ab3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1954.892251] env[68964]: DEBUG oslo_concurrency.lockutils [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1954.892251] env[68964]: DEBUG oslo_concurrency.lockutils [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1954.896269] env[68964]: INFO nova.compute.manager [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Terminating instance [ 1954.898016] env[68964]: DEBUG nova.compute.manager [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1954.898241] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1954.898536] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-351273fc-fdf6-4091-adc7-9466ea0c5a5d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.902712] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1954.909170] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79d39127-3a6d-4fff-a040-bf38987436c4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.939786] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dc39aed1-9371-469b-b43e-40ce313c8ab3 could not be found. [ 1954.940011] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1954.940201] env[68964]: INFO nova.compute.manager [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Took 0.04 seconds to destroy the instance on the hypervisor.
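The destroy sequence above is deliberately tolerant of a VM that never materialised on the backend: vmops catches the not-found lookup, logs the warning, and still reports the instance destroyed so the delete can finish its cleanup. A minimal sketch of that idempotent-delete pattern (the helper names and the local InstanceNotFound class are illustrative stand-ins, not Nova's actual signatures):

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        """No backend VM matches this instance UUID."""

    def destroy(instance_uuid, find_vm_ref, unregister_vm):
        # Tolerate the VM already being gone so delete stays idempotent
        # and the caller can continue with network and quota cleanup.
        try:
            unregister_vm(find_vm_ref(instance_uuid))
        except InstanceNotFound:
            LOG.warning("Instance does not exist on backend: %s",
                        instance_uuid)
        LOG.debug("Instance destroyed")

Either way the manager proceeds to deallocate the instance's ports, which is why the 0.04-second destroy is immediately followed by network teardown below.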
[ 1954.940443] env[68964]: DEBUG oslo.service.loopingcall [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1954.944746] env[68964]: DEBUG nova.compute.manager [-] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1954.944855] env[68964]: DEBUG nova.network.neutron [-] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1954.956952] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1954.957207] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1954.958633] env[68964]: INFO nova.compute.claims [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1954.968897] env[68964]: DEBUG nova.network.neutron [-] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1954.976987] env[68964]: INFO nova.compute.manager [-] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] Took 0.03 seconds to deallocate network for instance. [ 1955.060381] env[68964]: DEBUG oslo_concurrency.lockutils [None req-95e1b119-e12a-42a1-8da7-8615cd0148fc tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1955.061429] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 102.239s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1955.061609] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dc39aed1-9371-469b-b43e-40ce313c8ab3] During sync_power_state the instance has a pending task (deleting). Skip. 
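The Acquiring/acquired/released triplets with waited/held timings that bracket the claim above come from oslo.concurrency's named-lock decorator, whose wrapper logs how long a caller queued for the lock and how long it held it; a terminate that queues 462 seconds behind a stuck build shows up exactly this way. A minimal sketch of the same pattern, with an illustrative claim body:

    from oslo_concurrency import lockutils

    # Serialise all resource-tracker bookkeeping on one named lock, the
    # way instance_claim and abort_instance_claim contend for
    # "compute_resources" in this log.
    @lockutils.synchronized('compute_resources')
    def instance_claim(tracker, instance, node):
        # Illustrative body: test the request against inventory and
        # record the claim while no other claim can interleave.
        return tracker.claim(instance, node)

Because the waited/held DEBUG lines are emitted by the decorator's own wrapper (the "inner" frames in lockutils.py), lock contention can be read straight out of a log like this one without extra instrumentation.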
[ 1955.061838] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "dc39aed1-9371-469b-b43e-40ce313c8ab3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1955.123735] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8669c71e-add3-42d6-ba41-d26fb9e05449 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.131607] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38768166-7120-4531-b77d-4e699f413ea8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.162898] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33da3247-3f69-44db-82ca-610d7d0c5fd1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.170807] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cb812b6-210d-4ca3-ae7a-e30c21f25ac3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.184912] env[68964]: DEBUG nova.compute.provider_tree [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1955.195101] env[68964]: DEBUG nova.scheduler.client.report [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1955.208739] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1955.208841] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Start building networks asynchronously for instance.
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1955.244383] env[68964]: DEBUG nova.compute.utils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1955.245644] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1955.245777] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1955.254575] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1955.300820] env[68964]: DEBUG nova.policy [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdf2567a5f234d3ca11c17b2a6c50dab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3159a58c1d23417eb9c756a88435d17e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 1955.320028] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1955.344758] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1955.344997] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1955.345162] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1955.345397] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1955.345552] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1955.345692] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1955.345891] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1955.346058] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1955.346223]
env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1955.346386] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1955.346596] env[68964]: DEBUG nova.virt.hardware [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1955.347425] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c689b3-d5a5-4c51-9a32-457f29d2ae80 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.355166] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2de0e26-2064-4ee3-b02b-56fe11d58c5e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.657428] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Successfully created port: 65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1956.261285] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Successfully updated port: 65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1956.274439] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "refresh_cache-cc90a5a6-19e6-4674-ad06-2c840927409d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1956.274717] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "refresh_cache-cc90a5a6-19e6-4674-ad06-2c840927409d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1956.274929] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1956.354720] env[68964]: DEBUG 
nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1956.534626] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Updating instance_info_cache with network_info: [{"id": "65ba6eeb-f97f-45bd-8f6b-f5e636a70d56", "address": "fa:16:3e:6c:c5:ef", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap65ba6eeb-f9", "ovs_interfaceid": "65ba6eeb-f97f-45bd-8f6b-f5e636a70d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1956.545184] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "refresh_cache-cc90a5a6-19e6-4674-ad06-2c840927409d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1956.545496] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance network_info: |[{"id": "65ba6eeb-f97f-45bd-8f6b-f5e636a70d56", "address": "fa:16:3e:6c:c5:ef", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap65ba6eeb-f9", 
"ovs_interfaceid": "65ba6eeb-f97f-45bd-8f6b-f5e636a70d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1956.545903] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:c5:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '65ba6eeb-f97f-45bd-8f6b-f5e636a70d56', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1956.553207] env[68964]: DEBUG oslo.service.loopingcall [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1956.553659] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1956.553878] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b164ada-d0a8-45e2-b14f-b021859a94d9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1956.574108] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1956.574108] env[68964]: value = "task-3431771" [ 1956.574108] env[68964]: _type = "Task" [ 1956.574108] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1956.581545] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431771, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1956.831410] env[68964]: DEBUG nova.compute.manager [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Received event network-vif-plugged-65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1956.831673] env[68964]: DEBUG oslo_concurrency.lockutils [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] Acquiring lock "cc90a5a6-19e6-4674-ad06-2c840927409d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1956.831860] env[68964]: DEBUG oslo_concurrency.lockutils [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1956.831913] env[68964]: DEBUG oslo_concurrency.lockutils [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1956.832077] env[68964]: DEBUG nova.compute.manager [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] No waiting events found dispatching network-vif-plugged-65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1956.832268] env[68964]: WARNING nova.compute.manager [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Received unexpected event network-vif-plugged-65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 for instance with vm_state building and task_state spawning. [ 1956.832516] env[68964]: DEBUG nova.compute.manager [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Received event network-changed-65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1956.832673] env[68964]: DEBUG nova.compute.manager [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Refreshing instance network info cache due to event network-changed-65ba6eeb-f97f-45bd-8f6b-f5e636a70d56.
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1956.832854] env[68964]: DEBUG oslo_concurrency.lockutils [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] Acquiring lock "refresh_cache-cc90a5a6-19e6-4674-ad06-2c840927409d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1956.832987] env[68964]: DEBUG oslo_concurrency.lockutils [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] Acquired lock "refresh_cache-cc90a5a6-19e6-4674-ad06-2c840927409d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1956.833151] env[68964]: DEBUG nova.network.neutron [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Refreshing network info cache for port 65ba6eeb-f97f-45bd-8f6b-f5e636a70d56 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1957.079255] env[68964]: DEBUG nova.network.neutron [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Updated VIF entry in instance network info cache for port 65ba6eeb-f97f-45bd-8f6b-f5e636a70d56. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1957.079596] env[68964]: DEBUG nova.network.neutron [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Updating instance_info_cache with network_info: [{"id": "65ba6eeb-f97f-45bd-8f6b-f5e636a70d56", "address": "fa:16:3e:6c:c5:ef", "network": {"id": "5a19a352-d2c7-4a12-9da0-8d1e2ce3c0b7", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1329520651-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3159a58c1d23417eb9c756a88435d17e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f6ca3b2e-69a5-4cea-96a7-eaad5ec5fd9b", "external-id": "nsx-vlan-transportzone-989", "segmentation_id": 989, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap65ba6eeb-f9", "ovs_interfaceid": "65ba6eeb-f97f-45bd-8f6b-f5e636a70d56", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1957.086805] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431771, 'name': CreateVM_Task, 'duration_secs': 0.275261} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1957.086952] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1957.087556] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1957.087714] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1957.088032] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1957.088278] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4598974-e262-4b77-af61-271590a9d0cf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1957.090561] env[68964]: DEBUG oslo_concurrency.lockutils [req-4dfa00e9-e65d-4723-b787-c3abc403354c req-931302d9-95f8-498d-bf07-042e65382e30 service nova] Releasing lock "refresh_cache-cc90a5a6-19e6-4674-ad06-2c840927409d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1957.093574] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 1957.093574] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52b4111a-f61a-bcfe-03fd-82e50883aac5" [ 1957.093574] env[68964]: _type = "Task" [ 1957.093574] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1957.101547] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52b4111a-f61a-bcfe-03fd-82e50883aac5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1957.603943] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1957.604195] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1957.604437] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1998.081939] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1998.082342] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1998.082342] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1998.107119] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107306] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107362] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107483] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107609] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107728] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107851] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.107963] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.108091] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.108210] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1998.108330] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1998.108831] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1998.724639] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2000.725065] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2000.725552] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2000.725552] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2001.719736] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2001.724534] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2003.355478] env[68964]: WARNING oslo_vmware.rw_handles [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end 
closed connection without response [ 2003.355478] env[68964]: ERROR oslo_vmware.rw_handles [ 2003.356167] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2003.358962] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2003.359244] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Copying Virtual Disk [datastore1] vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/7f616d1b-1a5d-44ec-a2b6-75fefdd1bde8/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2003.359940] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f1239119-c29a-40f2-8daa-87c353ad3499 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.366962] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Waiting for the task: (returnval){ [ 2003.366962] env[68964]: value = "task-3431772" [ 2003.366962] env[68964]: _type = "Task" [ 2003.366962] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.374456] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Task: {'id': task-3431772, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2003.724455] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2003.877096] env[68964]: DEBUG oslo_vmware.exceptions [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2003.877472] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2003.878097] env[68964]: ERROR nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2003.878097] env[68964]: Faults: ['InvalidArgument'] [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Traceback (most recent call last): [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] yield resources [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self.driver.spawn(context, instance, image_meta, [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self._fetch_image_if_missing(context, vi) [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] image_cache(vi, tmp_image_ds_loc) [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] vm_util.copy_virtual_disk( [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] session._wait_for_task(vmdk_copy_task) [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] return self.wait_for_task(task_ref) [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] return evt.wait() [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] result = hub.switch() [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] return self.greenlet.switch() [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self.f(*self.args, **self.kw) [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] raise exceptions.translate_fault(task_info.error) [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Faults: ['InvalidArgument'] [ 2003.878097] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] [ 2003.879131] env[68964]: INFO nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Terminating instance [ 2003.880447] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2003.880662] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2003.880900] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7806c12f-b3c1-4829-915d-7f11fcbe244e {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.883274] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2003.883479] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2003.884207] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45f647d-ad95-47fe-8871-180bf623ae92 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.890654] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2003.890901] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-345d374d-be67-441a-ad84-1199f430aea0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.892959] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2003.893144] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2003.894114] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-09daeeba-f0a1-4943-b74c-f90dde3044b2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.898748] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Waiting for the task: (returnval){ [ 2003.898748] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52826cbd-4ddf-1c8e-e060-b68d0e81855f" [ 2003.898748] env[68964]: _type = "Task" [ 2003.898748] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.910585] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52826cbd-4ddf-1c8e-e060-b68d0e81855f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2003.964084] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2003.964317] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2003.964495] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Deleting the datastore file [datastore1] 094b1346-f24b-4360-b7c8-46fd2f2c668f {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2003.964758] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b7664737-91d3-46b0-8320-8bdca6db464a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.970644] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Waiting for the task: (returnval){ [ 2003.970644] env[68964]: value = "task-3431774" [ 2003.970644] env[68964]: _type = "Task" [ 2003.970644] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.979524] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Task: {'id': task-3431774, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2004.409564] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2004.409855] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Creating directory with path [datastore1] vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2004.410071] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a653edee-f286-4579-8728-90dd974be110 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.421130] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Created directory with path [datastore1] vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2004.421323] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Fetch image to [datastore1] vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2004.421493] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2004.422239] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de007c8a-eace-43ea-a9ed-8112f06e3b4b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.429057] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a3878ef-45ec-44c7-8a52-acc7cfb98f30 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.438018] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef66549c-9599-43c4-877d-5a3409b2ad66 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.468670] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4302a36c-910f-4238-8cde-1e8d52e1a9e5 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.479377] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5b32370f-9c97-415c-8de4-a465543f59fd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.481045] env[68964]: DEBUG oslo_vmware.api [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Task: {'id': task-3431774, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062436} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2004.481289] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2004.481475] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2004.481639] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2004.481823] env[68964]: INFO nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Took 0.60 seconds to destroy the instance on the hypervisor. 
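The spawn failure above ends in oslo_vmware's _poll_task raising exceptions.translate_fault(task_info.error), which surfaces as the VimFaultException with Faults: ['InvalidArgument'] seen in the traceback. A minimal, self-contained sketch of that poll-and-raise pattern (all names illustrative, not the real oslo.vmware API):

import time

class VimFaultException(Exception):
    """Carries backend fault names, e.g. the 'InvalidArgument' seen above."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def poll_task(get_task_info, interval=0.5):
    """Poll a task until it succeeds, raising if the backend reports an error."""
    while True:
        info = get_task_info()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # Analogue of `raise exceptions.translate_fault(task_info.error)`
            # in the traceback above.
            raise VimFaultException(info["faults"], info["message"])
        time.sleep(interval)

# Simulate the CopyVirtualDisk_Task failure recorded in this log.
states = iter([
    {"state": "running"},
    {"state": "error", "faults": ["InvalidArgument"],
     "message": "A specified parameter was not correct: fileType"},
])
try:
    poll_task(lambda: next(states), interval=0)
except VimFaultException as exc:
    print(exc, exc.fault_list)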
[ 2004.483879] env[68964]: DEBUG nova.compute.claims [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2004.484055] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2004.484267] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2004.503358] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2004.566964] env[68964]: DEBUG oslo_vmware.rw_handles [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2004.631430] env[68964]: DEBUG oslo_vmware.rw_handles [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2004.631694] env[68964]: DEBUG oslo_vmware.rw_handles [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2004.725022] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2004.728768] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b69c2d-fa5a-4df3-a27a-8d9fac2144bb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.737470] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-823e4605-6cc3-4bcc-b5b0-9e96a17dd228 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.741114] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2004.770247] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99007146-1d1c-44e9-bab5-37d2e4131582 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.777432] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d00296a-8a24-4a78-b9f1-659d37c1ea18 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.790385] env[68964]: DEBUG nova.compute.provider_tree [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2004.800022] env[68964]: DEBUG nova.scheduler.client.report [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2004.812368] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2004.812891] env[68964]: ERROR nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c 
tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2004.812891] env[68964]: Faults: ['InvalidArgument'] [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Traceback (most recent call last): [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self.driver.spawn(context, instance, image_meta, [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self._fetch_image_if_missing(context, vi) [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] image_cache(vi, tmp_image_ds_loc) [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] vm_util.copy_virtual_disk( [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] session._wait_for_task(vmdk_copy_task) [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] return self.wait_for_task(task_ref) [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] return evt.wait() [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] result = hub.switch() [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] return self.greenlet.switch() [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] self.f(*self.args, **self.kw) [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] raise exceptions.translate_fault(task_info.error) [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Faults: ['InvalidArgument'] [ 2004.812891] env[68964]: ERROR nova.compute.manager [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] [ 2004.813790] env[68964]: DEBUG nova.compute.utils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2004.814627] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.074s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2004.814803] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2004.814954] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2004.815961] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa354b35-6d9c-4d95-bdb8-f280042bb25d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.819363] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Build of instance 094b1346-f24b-4360-b7c8-46fd2f2c668f was re-scheduled: A specified parameter was not correct: fileType [ 2004.819363] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2004.819737] 
env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2004.819907] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2004.820091] env[68964]: DEBUG nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2004.820257] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2004.826700] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7be48fc-cd7c-4267-acc7-c83a2598c22f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.840478] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca8a866a-9f77-4ab6-9880-2289fe7f1dad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.846702] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d3a6aaf-bfd2-43c6-b0fe-5d50b1d1caaf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.876405] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180932MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2004.876562] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2004.876743] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2004.950235] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f has been 
scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2004.950397] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.950528] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.950654] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.950774] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.950892] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.951015] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.951140] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.951256] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.951552] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2004.951552] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2004.951685] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2005.082031] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0acd7d34-7a8a-4bcb-bcc7-1cf1c0bbfed7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.089031] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c854530-7e5c-4dba-86f6-97394f349c30 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.123547] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f9779cb-b5c3-4609-b91f-393f9e6fa006 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.132018] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fba1ddb0-fd57-46da-9a9d-8589f03c3f64 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.136154] env[68964]: DEBUG nova.network.neutron [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2005.148558] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2005.151125] env[68964]: INFO nova.compute.manager [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Took 0.33 seconds to deallocate network for instance. 
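The final resource view above is consistent with the allocations the tracker just listed: nine actively managed instances, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, plus the 512 MB 'reserved' in the MEMORY_MB inventory. A quick arithmetic cross-check (plain Python; every figure is taken from the log itself):

# Nine actively managed instances, allocation shape copied from the log.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 9
reserved_mb = 512  # MEMORY_MB 'reserved' in the inventory data above

used_ram = sum(a["MEMORY_MB"] for a in allocations) + reserved_mb
used_disk = sum(a["DISK_GB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)

# 1664 9 9 -- matching used_ram=1664MB, used_disk=9GB, used_vcpus=9 above.
print(used_ram, used_disk, used_vcpus)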
[ 2005.155728] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2005.167931] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2005.168122] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.237522] env[68964]: INFO nova.scheduler.client.report [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Deleted allocations for instance 094b1346-f24b-4360-b7c8-46fd2f2c668f [ 2005.259477] env[68964]: DEBUG oslo_concurrency.lockutils [None req-47f77627-3abd-48b1-9862-dca328faf85c tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 689.423s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.259477] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 493.127s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2005.259477] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "094b1346-f24b-4360-b7c8-46fd2f2c668f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2005.259477] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2005.259477] env[68964]: DEBUG
oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.262730] env[68964]: INFO nova.compute.manager [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Terminating instance [ 2005.264517] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquiring lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2005.264675] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Acquired lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2005.265576] env[68964]: DEBUG nova.network.neutron [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2005.292253] env[68964]: DEBUG nova.network.neutron [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2005.394641] env[68964]: DEBUG nova.network.neutron [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2005.403885] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Releasing lock "refresh_cache-094b1346-f24b-4360-b7c8-46fd2f2c668f" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2005.404368] env[68964]: DEBUG nova.compute.manager [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Start destroying the instance on the hypervisor.
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2005.404595] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2005.405196] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-78ad1db4-c2de-4c21-926e-71e01fad5c13 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.416099] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-118f88a2-dd2b-41b8-ab71-bced29a35098 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.444489] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 094b1346-f24b-4360-b7c8-46fd2f2c668f could not be found. [ 2005.444716] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2005.444888] env[68964]: INFO nova.compute.manager [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2005.445094] env[68964]: DEBUG oslo.service.loopingcall [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2005.445367] env[68964]: DEBUG nova.compute.manager [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2005.445468] env[68964]: DEBUG nova.network.neutron [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2005.464923] env[68964]: DEBUG nova.network.neutron [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2005.473752] env[68964]: DEBUG nova.network.neutron [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2005.483467] env[68964]: INFO nova.compute.manager [-] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] Took 0.04 seconds to deallocate network for instance.
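The Acquiring/acquired/released lines that bracket nearly every operation in this log come from oslo.concurrency's lockutils, which records how long a caller waited for and then held a named lock. A minimal sketch of guarding a critical section with the same primitive (assumes oslo.concurrency is installed; the function and its body are illustrative only):

from oslo_concurrency import lockutils

@lockutils.synchronized("compute_resources")
def update_usage():
    # While one greenthread runs this, others block on the same lock name;
    # lockutils emits the "acquired ... waited Ns" / "released ... held Ns"
    # DEBUG lines seen throughout this log.
    pass

update_usage()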
[ 2005.575983] env[68964]: DEBUG oslo_concurrency.lockutils [None req-f0cf79cf-5f60-4ec8-a7d6-af435880807a tempest-ServersTestFqdnHostnames-1144406634 tempest-ServersTestFqdnHostnames-1144406634-project-member] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.317s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.576936] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 152.754s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2005.577136] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 094b1346-f24b-4360-b7c8-46fd2f2c668f] During sync_power_state the instance has a pending task (deleting). Skip. [ 2005.577311] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "094b1346-f24b-4360-b7c8-46fd2f2c668f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2014.301233] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "be9830e6-1e07-443b-b08e-cefac29e2e5c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2014.301512] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Lock "be9830e6-1e07-443b-b08e-cefac29e2e5c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2014.311974] env[68964]: DEBUG nova.compute.manager [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2014.361218] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2014.361457] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2014.362885] env[68964]: INFO nova.compute.claims [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2014.534056] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-212fff0b-67c8-4c93-989f-e19eb80df42b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.542455] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e886690c-64a3-4cbe-8b77-6fa91dce3bf6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.573048] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8564350b-908a-44a5-8fc8-ec52ea479460 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.579643] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93bc88b7-3c7a-4f52-ad0c-5d4fc9868947 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.592687] env[68964]: DEBUG nova.compute.provider_tree [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2014.601209] env[68964]: DEBUG nova.scheduler.client.report [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2014.616860] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2014.617329] env[68964]: DEBUG nova.compute.manager [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2014.651125] env[68964]: DEBUG nova.compute.utils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2014.652504] env[68964]: DEBUG nova.compute.manager [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2014.652710] env[68964]: DEBUG nova.network.neutron [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2014.660331] env[68964]: DEBUG nova.compute.manager [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2014.720715] env[68964]: DEBUG nova.policy [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57b204a484c24d2eaa9a909b7c831bc9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '315c6290ec974ff0b91c8856a6716aa3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2014.724423] env[68964]: DEBUG nova.compute.manager [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2014.748892] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2014.749145] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2014.749301] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2014.749478] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2014.749622] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2014.749762] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2014.749963] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2014.750771] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2014.750771] env[68964]: DEBUG nova.virt.hardware [None 
req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2014.750771] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2014.750771] env[68964]: DEBUG nova.virt.hardware [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2014.751441] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e532ba9-8514-4c1d-8c1a-1af84c5b23c4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2014.759260] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba735788-1c00-4803-bed1-7a07c7870d7c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2015.059655] env[68964]: DEBUG nova.network.neutron [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Successfully created port: 43ce7d87-4a87-4c49-8566-938c243f8038 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2015.873714] env[68964]: DEBUG nova.compute.manager [req-757f224d-f65f-4567-8a46-0bd3b95ac1f0 req-85699478-849b-4826-bc35-46eb6f86da87 service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Received event network-vif-plugged-43ce7d87-4a87-4c49-8566-938c243f8038 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2015.875076] env[68964]: DEBUG oslo_concurrency.lockutils [req-757f224d-f65f-4567-8a46-0bd3b95ac1f0 req-85699478-849b-4826-bc35-46eb6f86da87 service nova] Acquiring lock "be9830e6-1e07-443b-b08e-cefac29e2e5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2015.875076] env[68964]: DEBUG oslo_concurrency.lockutils [req-757f224d-f65f-4567-8a46-0bd3b95ac1f0 req-85699478-849b-4826-bc35-46eb6f86da87 service nova] Lock "be9830e6-1e07-443b-b08e-cefac29e2e5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2015.875076] env[68964]: DEBUG oslo_concurrency.lockutils [req-757f224d-f65f-4567-8a46-0bd3b95ac1f0 req-85699478-849b-4826-bc35-46eb6f86da87 service nova] Lock "be9830e6-1e07-443b-b08e-cefac29e2e5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2015.875076] env[68964]: DEBUG nova.compute.manager 
[req-757f224d-f65f-4567-8a46-0bd3b95ac1f0 req-85699478-849b-4826-bc35-46eb6f86da87 service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] No waiting events found dispatching network-vif-plugged-43ce7d87-4a87-4c49-8566-938c243f8038 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2015.875076] env[68964]: WARNING nova.compute.manager [req-757f224d-f65f-4567-8a46-0bd3b95ac1f0 req-85699478-849b-4826-bc35-46eb6f86da87 service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Received unexpected event network-vif-plugged-43ce7d87-4a87-4c49-8566-938c243f8038 for instance with vm_state building and task_state spawning. [ 2015.955180] env[68964]: DEBUG nova.network.neutron [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Successfully updated port: 43ce7d87-4a87-4c49-8566-938c243f8038 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2015.982149] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "refresh_cache-be9830e6-1e07-443b-b08e-cefac29e2e5c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2015.982149] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired lock "refresh_cache-be9830e6-1e07-443b-b08e-cefac29e2e5c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2015.982279] env[68964]: DEBUG nova.network.neutron [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2016.017087] env[68964]: DEBUG nova.network.neutron [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2016.188371] env[68964]: DEBUG nova.network.neutron [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Updating instance_info_cache with network_info: [{"id": "43ce7d87-4a87-4c49-8566-938c243f8038", "address": "fa:16:3e:78:b8:43", "network": {"id": "0f42ddf1-c83c-4ac8-bfeb-12d61e91369b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1180077751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "315c6290ec974ff0b91c8856a6716aa3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43ce7d87-4a", "ovs_interfaceid": "43ce7d87-4a87-4c49-8566-938c243f8038", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2016.200656] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Releasing lock "refresh_cache-be9830e6-1e07-443b-b08e-cefac29e2e5c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2016.200995] env[68964]: DEBUG nova.compute.manager [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Instance network_info: |[{"id": "43ce7d87-4a87-4c49-8566-938c243f8038", "address": "fa:16:3e:78:b8:43", "network": {"id": "0f42ddf1-c83c-4ac8-bfeb-12d61e91369b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1180077751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "315c6290ec974ff0b91c8856a6716aa3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43ce7d87-4a", "ovs_interfaceid": "43ce7d87-4a87-4c49-8566-938c243f8038", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2016.201649] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:b8:43', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5f60c972-a72d-4c5f-a250-faadfd6eafbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '43ce7d87-4a87-4c49-8566-938c243f8038', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2016.209934] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Creating folder: Project (315c6290ec974ff0b91c8856a6716aa3). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2016.210440] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f45b14db-15d2-494f-a3bf-9dca24916ee0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2016.220660] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Created folder: Project (315c6290ec974ff0b91c8856a6716aa3) in parent group-v684465. [ 2016.220843] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Creating folder: Instances. Parent ref: group-v684605. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2016.221095] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cf6f1fc4-a2d7-41c9-937d-9e30adbe5675 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2016.229412] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Created folder: Instances in parent group-v684605. [ 2016.229668] env[68964]: DEBUG oslo.service.loopingcall [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2016.229860] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2016.230138] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4b0b6ed6-db93-4832-818f-0334314c1298 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2016.248496] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2016.248496] env[68964]: value = "task-3431777" [ 2016.248496] env[68964]: _type = "Task" [ 2016.248496] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2016.255740] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431777, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2016.757823] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431777, 'name': CreateVM_Task, 'duration_secs': 0.274529} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2016.758010] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2016.758689] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2016.758853] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2016.759182] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2016.759430] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e8eba3a8-c834-4241-a278-734daedab167 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2016.763646] env[68964]: DEBUG oslo_vmware.api [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Waiting for the task: (returnval){ [ 2016.763646] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525cd9c5-6df3-f2e1-b9f0-905ce27a9ad1" [ 2016.763646] env[68964]: _type = "Task" [ 2016.763646] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2016.770823] env[68964]: DEBUG oslo_vmware.api [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]525cd9c5-6df3-f2e1-b9f0-905ce27a9ad1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2017.274602] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2017.274920] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2017.275028] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2017.930847] env[68964]: DEBUG nova.compute.manager [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Received event network-changed-43ce7d87-4a87-4c49-8566-938c243f8038 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2017.931098] env[68964]: DEBUG nova.compute.manager [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Refreshing instance network info cache due to event network-changed-43ce7d87-4a87-4c49-8566-938c243f8038. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2017.931268] env[68964]: DEBUG oslo_concurrency.lockutils [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] Acquiring lock "refresh_cache-be9830e6-1e07-443b-b08e-cefac29e2e5c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2017.931406] env[68964]: DEBUG oslo_concurrency.lockutils [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] Acquired lock "refresh_cache-be9830e6-1e07-443b-b08e-cefac29e2e5c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2017.931561] env[68964]: DEBUG nova.network.neutron [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Refreshing network info cache for port 43ce7d87-4a87-4c49-8566-938c243f8038 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2018.202965] env[68964]: DEBUG nova.network.neutron [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Updated VIF entry in instance network info cache for port 43ce7d87-4a87-4c49-8566-938c243f8038. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2018.203337] env[68964]: DEBUG nova.network.neutron [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Updating instance_info_cache with network_info: [{"id": "43ce7d87-4a87-4c49-8566-938c243f8038", "address": "fa:16:3e:78:b8:43", "network": {"id": "0f42ddf1-c83c-4ac8-bfeb-12d61e91369b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1180077751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "315c6290ec974ff0b91c8856a6716aa3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43ce7d87-4a", "ovs_interfaceid": "43ce7d87-4a87-4c49-8566-938c243f8038", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2018.233215] env[68964]: DEBUG oslo_concurrency.lockutils [req-5d8d8f03-88e4-4875-896c-af427867cb83 req-36c23abe-0502-47d4-bb0f-69b15c357eda service nova] Releasing lock "refresh_cache-be9830e6-1e07-443b-b08e-cefac29e2e5c" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2051.889131] env[68964]: WARNING oslo_vmware.rw_handles [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2051.889131] env[68964]: ERROR oslo_vmware.rw_handles [ 2051.889131] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 
tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2051.890433] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2051.890680] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Copying Virtual Disk [datastore1] vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/dce2199d-eeb4-4cb4-aacf-d03baefe632a/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2051.890971] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-59435ca5-7251-4c33-8b89-f011c442fa1a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.898412] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Waiting for the task: (returnval){ [ 2051.898412] env[68964]: value = "task-3431778" [ 2051.898412] env[68964]: _type = "Task" [ 2051.898412] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.906263] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Task: {'id': task-3431778, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2052.409730] env[68964]: DEBUG oslo_vmware.exceptions [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2052.409877] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2052.410365] env[68964]: ERROR nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2052.410365] env[68964]: Faults: ['InvalidArgument'] [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Traceback (most recent call last): [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] yield resources [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self.driver.spawn(context, instance, image_meta, [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self._fetch_image_if_missing(context, vi) [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] image_cache(vi, tmp_image_ds_loc) [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] vm_util.copy_virtual_disk( [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] session._wait_for_task(vmdk_copy_task) [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] return self.wait_for_task(task_ref) [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] return evt.wait() [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] result = hub.switch() [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] return self.greenlet.switch() [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self.f(*self.args, **self.kw) [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] raise exceptions.translate_fault(task_info.error) [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Faults: ['InvalidArgument'] [ 2052.410365] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] [ 2052.411264] env[68964]: INFO nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Terminating instance [ 2052.412247] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2052.412503] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2052.412740] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ab40af5-1647-406d-8c06-64889bd38fec {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.415189] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2052.415390] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2052.416136] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eca934a6-ab43-4147-922f-38e0b379625c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.422798] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2052.423028] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d0b518d5-4293-4323-ab68-059c1a42dfb5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.425165] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2052.425334] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2052.426346] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-58689a67-4990-49ea-b005-ebc466cce2a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.431099] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 2052.431099] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ab6184-5ecf-abb5-8ace-84b55cc356bc" [ 2052.431099] env[68964]: _type = "Task" [ 2052.431099] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2052.438010] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52ab6184-5ecf-abb5-8ace-84b55cc356bc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2052.487165] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2052.487388] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2052.487546] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Deleting the datastore file [datastore1] 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2052.487813] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d9496f4a-22c8-4f3c-83a6-aca988346ef7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.494681] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Waiting for the task: (returnval){ [ 2052.494681] env[68964]: value = "task-3431780" [ 2052.494681] env[68964]: _type = "Task" [ 2052.494681] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2052.502411] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Task: {'id': task-3431780, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2052.941545] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2052.941855] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating directory with path [datastore1] vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2052.942055] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10be5f5e-3307-430f-9dbc-105bf9a5ec32 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.953912] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Created directory with path [datastore1] vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2052.954144] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Fetch image to [datastore1] vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2052.954364] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2052.955110] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc5e1672-c0dc-4618-9f54-a3ac1d6e5e35 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.961933] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8098c13f-9dc5-4660-8623-a8f3c770e404 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.971975] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6d911a8-f4c3-4968-a850-9f86bcb55abe {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.004899] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ef7b21e-e484-416e-85e3-be83fe1db348 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.012044] env[68964]: DEBUG oslo_vmware.api [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Task: {'id': task-3431780, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084828} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2053.013394] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2053.013587] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2053.013758] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2053.013927] env[68964]: INFO nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2053.016015] env[68964]: DEBUG nova.compute.claims [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2053.016197] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2053.016445] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.018865] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e68653eb-b9db-4b04-bdae-08e6197f65f2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.040598] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2053.093014] env[68964]: DEBUG oslo_vmware.rw_handles [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2053.152745] env[68964]: DEBUG oslo_vmware.rw_handles [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2053.152935] env[68964]: DEBUG oslo_vmware.rw_handles [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2053.235993] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0450214e-423b-4fdd-a965-1dd68a9a9abd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.243706] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c18915d-7b29-4ba5-a4a4-674df2eaa61f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.275158] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a6d69c3-ba3d-4cc2-8989-32e7790cee25 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.282473] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db869988-7557-45cd-b297-3efe33851891 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.295883] env[68964]: DEBUG nova.compute.provider_tree [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2053.304634] env[68964]: DEBUG nova.scheduler.client.report [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2053.323019] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.306s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.323579] env[68964]: ERROR nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2053.323579] env[68964]: Faults: ['InvalidArgument'] [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Traceback (most recent call last): [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2053.323579] env[68964]: ERROR nova.compute.manager 
[instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self.driver.spawn(context, instance, image_meta, [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self._fetch_image_if_missing(context, vi) [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] image_cache(vi, tmp_image_ds_loc) [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] vm_util.copy_virtual_disk( [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] session._wait_for_task(vmdk_copy_task) [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] return self.wait_for_task(task_ref) [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] return evt.wait() [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] result = hub.switch() [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] return self.greenlet.switch() [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] self.f(*self.args, **self.kw) [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] raise exceptions.translate_fault(task_info.error) [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Faults: ['InvalidArgument'] [ 2053.323579] env[68964]: ERROR nova.compute.manager [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] [ 2053.324496] env[68964]: DEBUG nova.compute.utils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2053.325941] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Build of instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 was re-scheduled: A specified parameter was not correct: fileType [ 2053.325941] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2053.326360] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2053.326546] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2053.326718] env[68964]: DEBUG nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2053.326880] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2053.664350] env[68964]: DEBUG nova.network.neutron [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2053.679065] env[68964]: INFO nova.compute.manager [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Took 0.35 seconds to deallocate network for instance. [ 2053.767032] env[68964]: INFO nova.scheduler.client.report [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Deleted allocations for instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 [ 2053.788836] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2b261417-52f2-4edf-bf5e-e18ef92c0e91 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 584.975s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.789275] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 389.126s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.789379] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Acquiring lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2053.789552] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.789716] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.791678] env[68964]: INFO nova.compute.manager [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Terminating instance [ 2053.793289] env[68964]: DEBUG nova.compute.manager [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2053.793482] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2053.793942] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5f45ed5e-2c9a-459d-afff-5d610dc104d1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.803042] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4a70179-fc43-41fa-8ee4-02c8829ae92a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.831188] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70 could not be found. [ 2053.831947] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2053.831947] env[68964]: INFO nova.compute.manager [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2053.831947] env[68964]: DEBUG oslo.service.loopingcall [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2053.832054] env[68964]: DEBUG nova.compute.manager [-] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2053.832156] env[68964]: DEBUG nova.network.neutron [-] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2053.877574] env[68964]: DEBUG nova.network.neutron [-] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2053.886196] env[68964]: INFO nova.compute.manager [-] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] Took 0.05 seconds to deallocate network for instance. [ 2053.970772] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7dd66a3c-2ac6-4abb-9769-416ebb374408 tempest-ServerGroupTestJSON-343765009 tempest-ServerGroupTestJSON-343765009-project-member] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.181s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.971586] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 201.149s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.971770] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70] During sync_power_state the instance has a pending task (deleting). Skip. 
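The dotted names quoted in the lock messages above are Python __qualname__ values: Nova's per-instance handlers define a nested helper function and guard it with oslo.concurrency's synchronized decorator, which logs "acquired by"/"released by" under the helper's qualified name, hence the <locals> segment. A minimal, illustrative sketch of the pattern, not Nova's exact code (Nova's own wrapper additionally prefixes the lock name, and the teardown body is elided here):

    from oslo_concurrency import lockutils

    class ComputeManager:
        def terminate_instance(self, context, instance_uuid):
            # A nested helper serializes all work on one instance under a
            # named lock; the decorator logs acquire/release using the
            # helper's __qualname__, which for nested functions contains
            # the '<locals>' marker seen in the log lines above.
            @lockutils.synchronized(instance_uuid)
            def do_terminate_instance():
                pass  # actual teardown elided

            do_terminate_instance()

    # For such a nested helper:
    #   do_terminate_instance.__qualname__ ==
    #     'ComputeManager.terminate_instance.<locals>.do_terminate_instance'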
[ 2053.971943] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8689d0f5-9ec5-4de7-b6b0-8cf6d0f05f70" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2058.168224] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2059.133013] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2059.724603] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2059.725021] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2059.725021] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2059.745802] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.745947] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746094] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746224] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746393] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746514] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746636] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746752] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746868] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2059.746987] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2060.725407] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2060.725407] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2060.725407] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2060.725407] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2062.721157] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2063.724962] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2063.724962] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2064.724588] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2065.724837] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2065.738407] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2065.738407] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2065.738538] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2065.738649] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2065.739794] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e33aa80b-66f8-4600-a6c0-a53998088bbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2065.748452] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-478ff0c5-e301-4840-99a3-5d32a02fed10 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2065.762478] env[68964]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f228f08-b3b3-48da-aad8-4e1d1e7b7ab9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2065.768945] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13a2c0ba-d08d-4296-9120-4f56c8590327 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2065.799320] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180925MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2065.799462] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2065.799681] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2065.864180] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.864366] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.864500] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.864626] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.864747] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.864867] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.864982] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.865112] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.865228] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2065.865412] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2065.865549] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2065.972067] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87757c73-f74f-4461-8127-d3bb11c66124 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2065.980256] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ad0339f-ea22-44a3-8fcd-bf05ce0a5c03 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2066.009956] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0812e7f1-d609-4e8d-a0e2-eec15b7bc345 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2066.016585] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83d93330-732e-4d83-b7a6-f31b673748eb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2066.029150] env[68964]: DEBUG nova.compute.provider_tree 
[None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2066.037296] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2066.052197] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2066.052364] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2068.080244] env[68964]: DEBUG oslo_concurrency.lockutils [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "cc90a5a6-19e6-4674-ad06-2c840927409d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2082.990323] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2082.990637] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2083.000940] env[68964]: DEBUG nova.compute.manager [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2083.048121] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2083.048371] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2083.049784] env[68964]: INFO nova.compute.claims [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2083.198397] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-638bf7f3-75d2-4112-ae80-5a26284a6440 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.206039] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd14c32b-a59a-4b22-a02c-c98028e4a424 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.237052] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eae6f2e0-faff-4e0b-b5f1-6d8c805d8237 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.243903] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcc6cd0e-fce9-47d3-9191-f16349ae3579 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.256920] env[68964]: DEBUG nova.compute.provider_tree [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2083.265594] env[68964]: DEBUG nova.scheduler.client.report [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2083.278802] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2083.279291] env[68964]: DEBUG nova.compute.manager [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2083.310659] env[68964]: DEBUG nova.compute.utils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2083.312171] env[68964]: DEBUG nova.compute.manager [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2083.312362] env[68964]: DEBUG nova.network.neutron [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2083.322771] env[68964]: DEBUG nova.compute.manager [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2083.385437] env[68964]: DEBUG nova.compute.manager [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2083.405051] env[68964]: DEBUG nova.policy [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21b5b62c1d9a4afc8e26b122ce6de51c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '07b4913b8fef4ee3a0d920bc36fefd18', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2083.408611] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2083.408846] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2083.409011] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2083.409195] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2083.409345] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2083.409490] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2083.409699] env[68964]: DEBUG nova.virt.hardware [None 
req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2083.409860] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2083.410061] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2083.410206] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2083.410373] env[68964]: DEBUG nova.virt.hardware [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2083.411518] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c941f5c-25be-4688-ae62-b766ef3ba52d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.419310] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac36b7e6-1520-4f91-a6ae-0b938dcb1313 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2083.755822] env[68964]: DEBUG nova.network.neutron [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Successfully created port: aaca2b41-bc90-4e18-a636-ff9daaa5b052 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2084.286108] env[68964]: DEBUG nova.compute.manager [req-5d801acd-0e13-421c-a5bf-75558b1ca0fc req-aeaaf339-ff4d-43da-a2c0-c1d54bedbe3e service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Received event network-vif-plugged-aaca2b41-bc90-4e18-a636-ff9daaa5b052 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2084.286367] env[68964]: DEBUG oslo_concurrency.lockutils [req-5d801acd-0e13-421c-a5bf-75558b1ca0fc req-aeaaf339-ff4d-43da-a2c0-c1d54bedbe3e service nova] Acquiring lock "f4fdc36a-1a04-46ac-84ad-a6a05ae64e61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2084.286556] env[68964]: DEBUG oslo_concurrency.lockutils [req-5d801acd-0e13-421c-a5bf-75558b1ca0fc req-aeaaf339-ff4d-43da-a2c0-c1d54bedbe3e service nova] Lock 
"f4fdc36a-1a04-46ac-84ad-a6a05ae64e61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2084.286698] env[68964]: DEBUG oslo_concurrency.lockutils [req-5d801acd-0e13-421c-a5bf-75558b1ca0fc req-aeaaf339-ff4d-43da-a2c0-c1d54bedbe3e service nova] Lock "f4fdc36a-1a04-46ac-84ad-a6a05ae64e61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2084.286880] env[68964]: DEBUG nova.compute.manager [req-5d801acd-0e13-421c-a5bf-75558b1ca0fc req-aeaaf339-ff4d-43da-a2c0-c1d54bedbe3e service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] No waiting events found dispatching network-vif-plugged-aaca2b41-bc90-4e18-a636-ff9daaa5b052 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2084.287072] env[68964]: WARNING nova.compute.manager [req-5d801acd-0e13-421c-a5bf-75558b1ca0fc req-aeaaf339-ff4d-43da-a2c0-c1d54bedbe3e service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Received unexpected event network-vif-plugged-aaca2b41-bc90-4e18-a636-ff9daaa5b052 for instance with vm_state building and task_state spawning. [ 2084.371755] env[68964]: DEBUG nova.network.neutron [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Successfully updated port: aaca2b41-bc90-4e18-a636-ff9daaa5b052 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2084.382511] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "refresh_cache-f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2084.382655] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "refresh_cache-f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2084.382808] env[68964]: DEBUG nova.network.neutron [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2084.418782] env[68964]: DEBUG nova.network.neutron [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2084.571349] env[68964]: DEBUG nova.network.neutron [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Updating instance_info_cache with network_info: [{"id": "aaca2b41-bc90-4e18-a636-ff9daaa5b052", "address": "fa:16:3e:8a:d5:af", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaaca2b41-bc", "ovs_interfaceid": "aaca2b41-bc90-4e18-a636-ff9daaa5b052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2084.582860] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "refresh_cache-f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2084.583165] env[68964]: DEBUG nova.compute.manager [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Instance network_info: |[{"id": "aaca2b41-bc90-4e18-a636-ff9daaa5b052", "address": "fa:16:3e:8a:d5:af", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaaca2b41-bc", "ovs_interfaceid": "aaca2b41-bc90-4e18-a636-ff9daaa5b052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2084.583555] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8a:d5:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92fe29b3-0907-453d-aabb-5559c4bd7c0f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aaca2b41-bc90-4e18-a636-ff9daaa5b052', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2084.590889] env[68964]: DEBUG oslo.service.loopingcall [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2084.591336] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2084.591563] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7509e83c-8e14-4e3f-87a8-69a3410b3ecb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2084.611909] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2084.611909] env[68964]: value = "task-3431781" [ 2084.611909] env[68964]: _type = "Task" [ 2084.611909] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2084.619306] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431781, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2085.123757] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431781, 'name': CreateVM_Task, 'duration_secs': 0.277538} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2085.127019] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2085.127019] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2085.127019] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2085.127019] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2085.127019] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f74a74a0-1cb8-4e3d-b894-5c62adbe6a0d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2085.130332] env[68964]: DEBUG oslo_vmware.api [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 2085.130332] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52dd83f5-5acc-e983-3519-e59e4015dc5b" [ 2085.130332] env[68964]: _type = "Task" [ 2085.130332] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2085.137874] env[68964]: DEBUG oslo_vmware.api [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52dd83f5-5acc-e983-3519-e59e4015dc5b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2085.641561] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2085.641561] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2085.641561] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5be09146-1c85-4a38-8d60-86d1c3ef65e3 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2086.315552] env[68964]: DEBUG nova.compute.manager [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Received event network-changed-aaca2b41-bc90-4e18-a636-ff9daaa5b052 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2086.315677] env[68964]: DEBUG nova.compute.manager [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Refreshing instance network info cache due to event network-changed-aaca2b41-bc90-4e18-a636-ff9daaa5b052. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2086.315909] env[68964]: DEBUG oslo_concurrency.lockutils [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] Acquiring lock "refresh_cache-f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2086.316224] env[68964]: DEBUG oslo_concurrency.lockutils [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] Acquired lock "refresh_cache-f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2086.316419] env[68964]: DEBUG nova.network.neutron [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Refreshing network info cache for port aaca2b41-bc90-4e18-a636-ff9daaa5b052 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2086.772807] env[68964]: DEBUG nova.network.neutron [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Updated VIF entry in instance network info cache for port aaca2b41-bc90-4e18-a636-ff9daaa5b052. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2086.773160] env[68964]: DEBUG nova.network.neutron [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Updating instance_info_cache with network_info: [{"id": "aaca2b41-bc90-4e18-a636-ff9daaa5b052", "address": "fa:16:3e:8a:d5:af", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaaca2b41-bc", "ovs_interfaceid": "aaca2b41-bc90-4e18-a636-ff9daaa5b052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2086.782094] env[68964]: DEBUG oslo_concurrency.lockutils [req-41c2854e-b377-4ccd-83bd-f2f5830af0f5 req-1f5898a0-8eea-4504-be8d-532b5ebf6481 service nova] Releasing lock "refresh_cache-f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2098.358769] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquiring lock "dcd4de94-0433-416d-a9f6-c24f584a80ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2098.358769] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Lock "dcd4de94-0433-416d-a9f6-c24f584a80ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2102.349930] env[68964]: WARNING oslo_vmware.rw_handles [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 
2102.349930] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2102.349930] env[68964]: ERROR oslo_vmware.rw_handles [ 2102.350437] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2102.352193] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2102.352432] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Copying Virtual Disk [datastore1] vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/e69eae08-0646-4691-9e86-636b0b5d1a36/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2102.352727] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-045d0852-a14a-4a37-80fb-c6be0342d924 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.360727] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 2102.360727] env[68964]: value = "task-3431782" [ 2102.360727] env[68964]: _type = "Task" [ 2102.360727] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2102.368502] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': task-3431782, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2102.870965] env[68964]: DEBUG oslo_vmware.exceptions [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2102.871274] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2102.871828] env[68964]: ERROR nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2102.871828] env[68964]: Faults: ['InvalidArgument'] [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Traceback (most recent call last): [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] yield resources [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self.driver.spawn(context, instance, image_meta, [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self._fetch_image_if_missing(context, vi) [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] image_cache(vi, tmp_image_ds_loc) [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] vm_util.copy_virtual_disk( [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] session._wait_for_task(vmdk_copy_task) [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] return self.wait_for_task(task_ref) [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] return evt.wait() [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] result = hub.switch() [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] return self.greenlet.switch() [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self.f(*self.args, **self.kw) [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] raise exceptions.translate_fault(task_info.error) [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Faults: ['InvalidArgument'] [ 2102.871828] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] [ 2102.872903] env[68964]: INFO nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Terminating instance [ 2102.874028] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2102.874028] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating directory with path 
[datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2102.874166] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-856d2186-5bdf-411a-9979-29f2a3454dd5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.876194] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2102.876382] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2102.877096] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-877f1be2-fe1d-4e0b-9b83-74f37563213c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.883845] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2102.884061] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-27e060cc-f672-4672-ad53-83dd71004912 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.886162] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2102.886341] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2102.887296] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e44d494d-4b22-4612-b260-6a27147ef557 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.891842] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 2102.891842] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528df974-6b47-69e9-fdb0-f32e4428f9b3" [ 2102.891842] env[68964]: _type = "Task" [ 2102.891842] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2102.898877] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]528df974-6b47-69e9-fdb0-f32e4428f9b3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2103.270261] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2103.270497] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2103.270642] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Deleting the datastore file [datastore1] 2243b807-c2a0-4917-aae8-5de31dc52e53 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2103.270906] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fd2262b3-d063-4648-8702-1980544b20e2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.277348] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 2103.277348] env[68964]: value = "task-3431784" [ 2103.277348] env[68964]: _type = "Task" [ 2103.277348] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2103.286089] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': task-3431784, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2103.402042] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2103.402252] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating directory with path [datastore1] vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2103.402407] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d4d12b04-1f4b-4240-b422-1fc1ef414973 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.414150] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created directory with path [datastore1] vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2103.414342] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Fetch image to [datastore1] vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2103.414511] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2103.415248] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab574640-9576-4693-ab32-cb285b86bfe5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.421792] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1088b107-3582-478c-8014-d407e14b30c8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.430766] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b43a36-d9e6-43d7-b735-9672151dbeb9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.460941] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e90a3914-69b7-4db7-9b43-b35e1ec478a8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.466298] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cbca5495-28d3-4c1f-978d-4e0a3ef3756a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.485539] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2103.533143] env[68964]: DEBUG oslo_vmware.rw_handles [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2103.592385] env[68964]: DEBUG oslo_vmware.rw_handles [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2103.592603] env[68964]: DEBUG oslo_vmware.rw_handles [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2103.787351] env[68964]: DEBUG oslo_vmware.api [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': task-3431784, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076443} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2103.787621] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2103.787810] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2103.787978] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2103.788164] env[68964]: INFO nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Took 0.91 seconds to destroy the instance on the hypervisor. [ 2103.790211] env[68964]: DEBUG nova.compute.claims [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2103.790382] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2103.790600] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2103.972024] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b702db56-c45e-437a-9413-9ca5e3b41377 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.978964] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e93b0a3-a94e-4dda-a086-67be9a904380 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.010834] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62f55da0-69a4-4b4c-a384-d2cda3b12dbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.018060] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e7a48f4-01d5-423b-8f86-43dd63f880ad {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.031079] env[68964]: DEBUG nova.compute.provider_tree [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2104.039818] env[68964]: DEBUG nova.scheduler.client.report [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2104.053481] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.263s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.054162] env[68964]: ERROR nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2104.054162] env[68964]: Faults: ['InvalidArgument'] [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Traceback (most recent call last): [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self.driver.spawn(context, instance, image_meta, [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self._fetch_image_if_missing(context, vi) [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] image_cache(vi, tmp_image_ds_loc) [ 2104.054162] env[68964]: ERROR 
nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] vm_util.copy_virtual_disk( [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] session._wait_for_task(vmdk_copy_task) [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] return self.wait_for_task(task_ref) [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] return evt.wait() [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] result = hub.switch() [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] return self.greenlet.switch() [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] self.f(*self.args, **self.kw) [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] raise exceptions.translate_fault(task_info.error) [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Faults: ['InvalidArgument'] [ 2104.054162] env[68964]: ERROR nova.compute.manager [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] [ 2104.055066] env[68964]: DEBUG nova.compute.utils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2104.056189] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 
tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Build of instance 2243b807-c2a0-4917-aae8-5de31dc52e53 was re-scheduled: A specified parameter was not correct: fileType [ 2104.056189] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2104.056548] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2104.056719] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2104.056888] env[68964]: DEBUG nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2104.057061] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2104.326165] env[68964]: DEBUG nova.network.neutron [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2104.336866] env[68964]: INFO nova.compute.manager [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Took 0.28 seconds to deallocate network for instance.
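The two tracebacks above record the same failure at two layers: CopyVirtualDisk_Task fails on the vCenter side with fault InvalidArgument ("A specified parameter was not correct: fileType"), oslo.vmware's task poller translates the failed task state into a VimFaultException, and Nova then aborts the resource claim and reschedules the build. As a rough illustration of that polling contract — not Nova's actual vm_util.copy_virtual_disk, and the session wiring and function name here are assumptions for the sketch — a caller can branch on the fault list the exception carries:

```python
# Sketch only: mirrors the wait_for_task -> VimFaultException path shown in
# the tracebacks above. `session` is an oslo_vmware.api.VMwareAPISession;
# the argument names follow the vSphere VirtualDiskManager.CopyVirtualDisk_Task
# API. This is illustrative, not Nova's implementation.
import logging

from oslo_vmware import exceptions as vexc

LOG = logging.getLogger(__name__)


def copy_virtual_disk(session, dc_ref, source, dest):
    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', disk_mgr,
                              sourceName=source, sourceDatacenter=dc_ref,
                              destName=dest, destDatacenter=dc_ref)
    try:
        # wait_for_task polls the task state (the "progress is 0%" lines)
        # and raises translate_fault(task_info.error) when the task errors.
        return session.wait_for_task(task)
    except vexc.VimFaultException as e:
        # e.fault_list carried ['InvalidArgument'] in the run logged above;
        # Nova responds by destroying the half-built VM, aborting the
        # resource claim, and rescheduling the build elsewhere.
        LOG.warning('Disk copy failed with faults %s: %s', e.fault_list, e)
        raise
```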
[ 2104.429112] env[68964]: INFO nova.scheduler.client.report [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Deleted allocations for instance 2243b807-c2a0-4917-aae8-5de31dc52e53 [ 2104.460829] env[68964]: DEBUG oslo_concurrency.lockutils [None req-ee227b11-3624-4709-81fa-842a72d087b2 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 634.096s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.462668] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 438.603s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.463036] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "2243b807-c2a0-4917-aae8-5de31dc52e53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.463380] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.463672] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.466750] env[68964]: INFO nova.compute.manager [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Terminating instance [ 2104.469293] env[68964]: DEBUG nova.compute.manager [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2104.469610] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2104.469994] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7de0d0ca-d0bc-4e1f-be26-6648167e9d4a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.474648] env[68964]: DEBUG nova.compute.manager [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2104.487995] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40854346-ecce-4e2e-a13a-48bf0c623286 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.526186] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2243b807-c2a0-4917-aae8-5de31dc52e53 could not be found. [ 2104.526359] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2104.526535] env[68964]: INFO nova.compute.manager [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Took 0.06 seconds to destroy the instance on the hypervisor. [ 2104.526822] env[68964]: DEBUG oslo.service.loopingcall [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2104.527780] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.527997] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.529484] env[68964]: INFO nova.compute.claims [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2104.531753] env[68964]: DEBUG nova.compute.manager [-] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2104.531866] env[68964]: DEBUG nova.network.neutron [-] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2104.555889] env[68964]: DEBUG nova.network.neutron [-] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2104.573366] env[68964]: INFO nova.compute.manager [-] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] Took 0.04 seconds to deallocate network for instance. [ 2104.661440] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b32a120e-d2cc-4264-8f2e-61f0243e6c27 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.662695] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 251.839s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.662990] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 2243b807-c2a0-4917-aae8-5de31dc52e53] During sync_power_state the instance has a pending task (deleting). Skip. 
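Nearly every step above is serialized through oslo.concurrency: per-instance locks ("&lt;uuid&gt;", "&lt;uuid&gt;-events", "refresh_cache-&lt;uuid&gt;") and the shared "compute_resources" lock, each logged with how long the caller waited for the lock and how long it was held. A minimal sketch of the two lockutils patterns that emit those lines — names are illustrative, not Nova's code:

```python
from oslo_concurrency import lockutils


# Decorator form: produces the 'Acquiring lock "..." by "..."' /
# '... acquired ... :: waited 0.000s' / '... "released" ... :: held 0.199s'
# triplets logged from lockutils.py:404/409/423 above.
@lockutils.synchronized('dcd4de94-0433-416d-a9f6-c24f584a80ad')
def do_terminate_instance():
    pass  # runs with the per-instance lock held


# Context-manager form: produces the plain 'Acquiring/Acquired/Releasing
# lock' lines from lockutils.py:312/315/333 around the cache refreshes.
def refresh_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass  # rebuild the network info cache while holding the lock
```

Passing external=True to lockutils.lock adds a file-based lock on top of the in-process semaphore, which is what the 'Acquired external semaphore "[datastore1] devstack-image-cache_base/..."' lines for the image-cache paths correspond to.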
[ 2104.663249] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "2243b807-c2a0-4917-aae8-5de31dc52e53" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.706396] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd81d2e2-56fc-43fd-81d2-493fd25e21eb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.714412] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bde6f634-ce71-4b41-aaf5-29ef0c5f7805 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.744461] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a18ac14e-6390-4de2-a8ba-aadc9738cc11 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.751495] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38eae674-ed43-4cf2-ab58-6423e7414111 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.764588] env[68964]: DEBUG nova.compute.provider_tree [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2104.772980] env[68964]: DEBUG nova.scheduler.client.report [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2104.786330] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.786840] env[68964]: DEBUG nova.compute.manager [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Start building networks asynchronously for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2104.819145] env[68964]: DEBUG nova.compute.utils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2104.821135] env[68964]: DEBUG nova.compute.manager [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2104.821317] env[68964]: DEBUG nova.network.neutron [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2104.829159] env[68964]: DEBUG nova.compute.manager [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2104.881679] env[68964]: DEBUG nova.policy [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8594b23594cb4ddab259336667a168ac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf14ecac1214417b9f5052fc02e90878', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2104.891223] env[68964]: DEBUG nova.compute.manager [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2104.915372] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2104.915626] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2104.915782] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2104.915959] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2104.916117] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2104.916264] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2104.916469] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2104.916628] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2104.916793] env[68964]: DEBUG 
nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2104.916951] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2104.917133] env[68964]: DEBUG nova.virt.hardware [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2104.918394] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e940e10-e164-427a-a685-7eccaf7fce59 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.926059] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8480931a-5662-4a80-b8a7-5b9aea93ed5f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2105.229272] env[68964]: DEBUG nova.network.neutron [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Successfully created port: 1b4f6a5e-3354-4e42-bc36-9d828d8854c8 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2105.846514] env[68964]: DEBUG nova.network.neutron [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Successfully updated port: 1b4f6a5e-3354-4e42-bc36-9d828d8854c8 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2105.859487] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquiring lock "refresh_cache-dcd4de94-0433-416d-a9f6-c24f584a80ad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2105.859719] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquired lock "refresh_cache-dcd4de94-0433-416d-a9f6-c24f584a80ad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2105.859883] env[68964]: DEBUG nova.network.neutron [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2105.899824] env[68964]: DEBUG nova.network.neutron [None 
req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Instance cache missing network info. {{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2106.053244] env[68964]: DEBUG nova.network.neutron [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Updating instance_info_cache with network_info: [{"id": "1b4f6a5e-3354-4e42-bc36-9d828d8854c8", "address": "fa:16:3e:61:50:da", "network": {"id": "d88d8f7f-9a7f-4925-88cc-cbcda6f17e87", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1773174320-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf14ecac1214417b9f5052fc02e90878", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b4f6a5e-33", "ovs_interfaceid": "1b4f6a5e-3354-4e42-bc36-9d828d8854c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2106.063721] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Releasing lock "refresh_cache-dcd4de94-0433-416d-a9f6-c24f584a80ad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2106.063985] env[68964]: DEBUG nova.compute.manager [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Instance network_info: |[{"id": "1b4f6a5e-3354-4e42-bc36-9d828d8854c8", "address": "fa:16:3e:61:50:da", "network": {"id": "d88d8f7f-9a7f-4925-88cc-cbcda6f17e87", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1773174320-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf14ecac1214417b9f5052fc02e90878", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b4f6a5e-33", "ovs_interfaceid": 
"1b4f6a5e-3354-4e42-bc36-9d828d8854c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2106.064378] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:50:da', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dd72ef32-a57c-43b0-93df-e8a030987d44', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1b4f6a5e-3354-4e42-bc36-9d828d8854c8', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2106.071653] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Creating folder: Project (bf14ecac1214417b9f5052fc02e90878). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2106.072131] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6b3a5405-ae4c-4c5c-90db-76963d54a0e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.083222] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Created folder: Project (bf14ecac1214417b9f5052fc02e90878) in parent group-v684465. [ 2106.083397] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Creating folder: Instances. Parent ref: group-v684609. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2106.083605] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c4e569da-dd7f-42ac-b492-27ecdbf793aa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.092249] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Created folder: Instances in parent group-v684609. [ 2106.092464] env[68964]: DEBUG oslo.service.loopingcall [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2106.092638] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2106.092817] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e35d3b85-8620-423d-a2fa-31069d6420de {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.111795] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2106.111795] env[68964]: value = "task-3431787" [ 2106.111795] env[68964]: _type = "Task" [ 2106.111795] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2106.118764] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431787, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2106.371203] env[68964]: DEBUG nova.compute.manager [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Received event network-vif-plugged-1b4f6a5e-3354-4e42-bc36-9d828d8854c8 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2106.371898] env[68964]: DEBUG oslo_concurrency.lockutils [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] Acquiring lock "dcd4de94-0433-416d-a9f6-c24f584a80ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2106.372187] env[68964]: DEBUG oslo_concurrency.lockutils [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] Lock "dcd4de94-0433-416d-a9f6-c24f584a80ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2106.372381] env[68964]: DEBUG oslo_concurrency.lockutils [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] Lock "dcd4de94-0433-416d-a9f6-c24f584a80ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2106.372599] env[68964]: DEBUG nova.compute.manager [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] No waiting events found dispatching network-vif-plugged-1b4f6a5e-3354-4e42-bc36-9d828d8854c8 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2106.372807] env[68964]: WARNING nova.compute.manager [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Received unexpected event network-vif-plugged-1b4f6a5e-3354-4e42-bc36-9d828d8854c8 for instance with vm_state building and task_state spawning.
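The "Waiting for the task ... to complete" and "progress is 0%" entries above come from oslo.vmware's task polling: Folder.CreateVM_Task returns a task reference immediately, and the session then polls the task's info until vCenter reports success or error. A minimal sketch of that pattern (the TaskInfo shape and the get_task_info callback are stand-ins for illustration, not the real oslo.vmware interfaces):

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:                 # simplified stand-in for vim.TaskInfo
        state: str                  # 'queued' | 'running' | 'success' | 'error'
        progress: int = 0
        result: object = None
        error: str = ""

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        """Poll a vCenter task until it leaves the running states (sketch only)."""
        while True:
            info = get_task_info(task_ref)   # one property-collector round-trip
            if info.state == "success":
                return info.result
            if info.state == "error":
                # oslo.vmware raises a translated fault here (e.g. VimFaultException)
                raise RuntimeError(f"Task {task_ref} failed: {info.error}")
            # 'queued' or 'running': log progress and poll again
            print(f"Task: {{'id': {task_ref}}} progress is {info.progress}%.")
            time.sleep(interval)

In the log, CreateVM_Task goes from the 0% poll to completion in under a second (duration_secs: 0.285747 below), after which the driver moves on to the datastore image cache.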
[ 2106.373066] env[68964]: DEBUG nova.compute.manager [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Received event network-changed-1b4f6a5e-3354-4e42-bc36-9d828d8854c8 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2106.373281] env[68964]: DEBUG nova.compute.manager [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Refreshing instance network info cache due to event network-changed-1b4f6a5e-3354-4e42-bc36-9d828d8854c8. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2106.373528] env[68964]: DEBUG oslo_concurrency.lockutils [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] Acquiring lock "refresh_cache-dcd4de94-0433-416d-a9f6-c24f584a80ad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2106.373695] env[68964]: DEBUG oslo_concurrency.lockutils [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] Acquired lock "refresh_cache-dcd4de94-0433-416d-a9f6-c24f584a80ad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2106.373859] env[68964]: DEBUG nova.network.neutron [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Refreshing network info cache for port 1b4f6a5e-3354-4e42-bc36-9d828d8854c8 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2106.608992] env[68964]: DEBUG nova.network.neutron [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Updated VIF entry in instance network info cache for port 1b4f6a5e-3354-4e42-bc36-9d828d8854c8. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2106.609380] env[68964]: DEBUG nova.network.neutron [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Updating instance_info_cache with network_info: [{"id": "1b4f6a5e-3354-4e42-bc36-9d828d8854c8", "address": "fa:16:3e:61:50:da", "network": {"id": "d88d8f7f-9a7f-4925-88cc-cbcda6f17e87", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1773174320-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bf14ecac1214417b9f5052fc02e90878", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd72ef32-a57c-43b0-93df-e8a030987d44", "external-id": "nsx-vlan-transportzone-340", "segmentation_id": 340, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b4f6a5e-33", "ovs_interfaceid": "1b4f6a5e-3354-4e42-bc36-9d828d8854c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2106.618680] env[68964]: DEBUG oslo_concurrency.lockutils [req-1db1e7f7-f7c1-4b00-98de-b4a2c3a07c37 req-7d07112f-fcff-437d-b7d2-a54575647141 service nova] Releasing lock "refresh_cache-dcd4de94-0433-416d-a9f6-c24f584a80ad" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2106.622397] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431787, 'name': CreateVM_Task, 'duration_secs': 0.285747} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2106.622558] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2106.623167] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2106.623323] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2106.623631] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2106.623867] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-22bb843a-59df-4bda-a206-f906bce0caff {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.628517] env[68964]: DEBUG oslo_vmware.api [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Waiting for the task: (returnval){ [ 2106.628517] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52af781d-8cd1-31a8-f820-9ed80a36533a" [ 2106.628517] env[68964]: _type = "Task" [ 2106.628517] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2106.636464] env[68964]: DEBUG oslo_vmware.api [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52af781d-8cd1-31a8-f820-9ed80a36533a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
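The lock acquire/release pairs around the devstack-image-cache_base path above, together with the "Processing image" and "Acquiring lock ...vmdk" entries that follow, implement a per-image datastore cache: whichever request wins the lock probes the datastore for the cached VMDK (the SearchDatastore_Task) and only downloads the image when it is missing. A condensed sketch of the pattern, assuming hypothetical datastore_exists and download_image helpers rather than nova's real ones:

    import threading
    from collections import defaultdict

    _image_locks = defaultdict(threading.Lock)   # one lock per cached image path

    def fetch_image_if_missing(image_id, datastore_exists, download_image):
        cache = f"[datastore1] devstack-image-cache_base/{image_id}/{image_id}.vmdk"
        with _image_locks[cache]:                # serializes concurrent spawns of the same image
            if datastore_exists(cache):          # the SearchDatastore_Task probe
                return cache                     # cache hit: nothing to download
            # hypothetical temp path; the real one carries a random UUID component
            tmp = f"[datastore1] vmware_temp/{image_id}/tmp-sparse.vmdk"
            download_image(image_id, tmp)        # HTTP transfer into the datastore
            # the sparse temp disk is then copied into the cache location
            # (the CopyVirtualDisk_Task seen later in this log)
            return cache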
[ 2107.140645] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2107.140990] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2107.141099] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a19a3c3d-8a7a-4109-9c4c-67c00f1f63bf tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2111.725699] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2117.736392] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2117.736726] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances with incomplete migration {{(pid=68964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2118.735592] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2119.724520] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2119.724883] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2119.724883] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2119.748017] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Skipping network cache update for instance because it is Building.
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.748456] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.748753] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.749023] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.749268] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.749487] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.749701] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.749917] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.750147] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.750385] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2119.750619] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
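Every instance in the list above is skipped for the same reason: the heal task deliberately leaves instances in the Building state alone, since their network info cache is still being populated by the spawn path itself. The filter reduces to something like this sketch (the uuid/vm_state attribute shape is assumed for illustration):

    from types import SimpleNamespace

    BUILDING = "building"

    def instances_to_heal(instances):
        """Return only instances whose info cache is safe to refresh."""
        to_heal = []
        for inst in instances:
            if inst.vm_state == BUILDING:
                print(f"[instance: {inst.uuid}] Skipping network cache update "
                      "for instance because it is Building.")
                continue
            to_heal.append(inst)
        return to_heal

    # With every tracked instance still building, nothing is left to heal,
    # matching the "Didn't find any instances" line above.
    demo = [SimpleNamespace(uuid="8f94d3c8-4674-463d-8829-68a184967183",
                            vm_state="building")]
    assert instances_to_heal(demo) == []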
[ 2120.724986] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2120.725429] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2120.725429] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2121.725145] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2124.719736] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2124.724372] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2125.725081] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2126.724683] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2126.737651] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2126.738014] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2126.738165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2126.738328] env[68964]: DEBUG nova.compute.resource_tracker [None
req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2126.739730] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56ead5d8-41d3-4181-a3cb-690800c32e7d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2126.748407] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef7cb51c-b803-4ebf-b728-ed1932864a1a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2126.762611] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee5ae7d9-4201-4f63-b602-62fb1f5d606b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2126.768810] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a45aee35-df7c-4aea-b105-1d30b6f3cea3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2126.798962] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180918MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2126.799120] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2126.799320] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2126.934506] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8f94d3c8-4674-463d-8829-68a184967183 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.934677] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.934810] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.934935] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935068] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935190] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935307] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935423] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935537] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935650] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dcd4de94-0433-416d-a9f6-c24f584a80ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2126.935891] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2126.936035] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2127.050983] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb34d5f5-5cb1-4258-a379-d4f37f04a138 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2127.058724] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7267f40-d729-4808-a0c8-a80ac704a461 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2127.088074] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ecc365c-334e-4c36-a0d9-bbeac36c9e98 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2127.095216] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8537691-7032-4290-897c-f8c5b46729a6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2127.108201] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2127.116394] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2127.130007] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2127.130215] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2133.724997] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
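The resource audit above is plain arithmetic over the tracked instances: ten guests at 128 MB / 1 vCPU / 1 GB disk plus the 512 MB host reservation produce exactly the used_ram=1792MB, used_vcpus=10 and used_disk=10GB figures reported, and the provider inventory is only pushed to placement when it differs from the cached copy. A small sketch under those assumptions (the push callback stands in for the real placement client call):

    # Ten m1.nano-sized allocations, as listed in the audit above.
    allocations = [{"MEMORY_MB": 128, "VCPU": 1, "DISK_GB": 1}] * 10

    used = {rc: sum(a[rc] for a in allocations)
            for rc in ("MEMORY_MB", "VCPU", "DISK_GB")}
    used_ram_mb = used["MEMORY_MB"] + 512    # reserved host memory counts as used
    print(f"used_ram={used_ram_mb}MB used_disk={used['DISK_GB']}GB "
          f"used_vcpus={used['VCPU']}")      # -> used_ram=1792MB used_disk=10GB used_vcpus=10

    def maybe_update_inventory(reported, cached, push):
        """Skip the placement call when inventory is unchanged (sketch)."""
        if reported == cached:               # plain dict equality is enough here
            print("Inventory has not changed; skipping update")
            return cached
        push(reported)                       # hypothetical placement client call
        return reported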
[ 2133.725273] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2133.733607] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] There are 0 instances to clean {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2152.370174] env[68964]: WARNING oslo_vmware.rw_handles [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2152.370174] env[68964]: ERROR oslo_vmware.rw_handles [ 2152.371132] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2152.372782] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2152.373031] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Copying Virtual Disk [datastore1] vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1]
vmware_temp/ea971e5d-36a8-40c1-9fd6-8d4d7170802e/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2152.373312] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0e601ce4-424e-4d9f-89f5-886ffd3f372a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.382921] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 2152.382921] env[68964]: value = "task-3431788" [ 2152.382921] env[68964]: _type = "Task" [ 2152.382921] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2152.390414] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431788, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2152.893716] env[68964]: DEBUG oslo_vmware.exceptions [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2152.894011] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2152.894572] env[68964]: ERROR nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2152.894572] env[68964]: Faults: ['InvalidArgument'] [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] Traceback (most recent call last): [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] yield resources [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self.driver.spawn(context, instance, image_meta, [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2152.894572] env[68964]: ERROR 
nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self._fetch_image_if_missing(context, vi) [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] image_cache(vi, tmp_image_ds_loc) [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] vm_util.copy_virtual_disk( [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] session._wait_for_task(vmdk_copy_task) [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] return self.wait_for_task(task_ref) [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] return evt.wait() [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] result = hub.switch() [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] return self.greenlet.switch() [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self.f(*self.args, **self.kw) [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] raise exceptions.translate_fault(task_info.error) [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] Faults: ['InvalidArgument'] [ 2152.894572] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] [ 2152.895353] env[68964]: INFO nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Terminating instance [ 2152.896489] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2152.896690] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2152.896932] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-37dd23db-bbb4-423b-9fcd-491def4ef43d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.899124] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
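The traceback above shows where the polling loop surfaces a failure: CopyVirtualDisk_Task ends in an InvalidArgument fault on fileType, oslo.vmware translates it into a VimFaultException, and the compute manager responds by tearing the half-built instance down. The control flow reduces to this sketch (the exception class and callbacks are stand-ins for nova's internals, not its real signatures):

    class VimFault(Exception):               # stand-in for oslo_vmware's VimFaultException
        def __init__(self, msg, fault_list):
            super().__init__(msg)
            self.fault_list = fault_list

    def build_and_run_instance(spawn, shutdown, instance):
        try:
            spawn(instance)                  # driver.spawn -> fetch image -> copy disk
        except VimFault as exc:
            # e.g. "A specified parameter was not correct: fileType",
            # Faults: ['InvalidArgument']
            print(f"Instance failed to spawn: {exc} Faults: {exc.fault_list}")
            shutdown(instance)               # unregister the VM, delete datastore contents
            raise                            # propagate so the manager can record the failure

Note that the cleanup runs while an unrelated request (req-79b09d08) holds the image-cache lock and recreates the cache directory; the two activities interleave freely in the log because they touch different locks.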
[ 2152.899320] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2152.900035] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951bf469-7050-4801-95f2-00dca5741de1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.906944] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2152.907186] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e2c3ea27-ae06-499a-a299-4727c9ff43b0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.909297] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2152.909465] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2152.910408] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d42c294-c545-4c98-80b1-e81240a00d93 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.914776] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Waiting for the task: (returnval){ [ 2152.914776] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5200ef7c-b64e-b0c0-9a0b-bdf787c096dc" [ 2152.914776] env[68964]: _type = "Task" [ 2152.914776] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2152.922382] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5200ef7c-b64e-b0c0-9a0b-bdf787c096dc, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2152.975151] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2152.975423] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2152.975547] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleting the datastore file [datastore1] 8f94d3c8-4674-463d-8829-68a184967183 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2152.975810] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-34f22975-aea5-481b-84c9-56638752af54 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.982588] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 2152.982588] env[68964]: value = "task-3431790" [ 2152.982588] env[68964]: _type = "Task" [ 2152.982588] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2152.989871] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431790, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2153.427543] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2153.427842] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Creating directory with path [datastore1] vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2153.427982] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0c60f66f-45d5-4810-bf11-702b94adb0b7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.439015] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Created directory with path [datastore1] vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2153.439237] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Fetch image to [datastore1] vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2153.439404] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2153.440116] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ae79ba-6b45-4753-9a70-476ad8b681f1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.446398] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddef0924-7a32-4557-a52d-4330d891b042 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.455079] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e038ddf4-0351-4bcc-8216-c74296696875 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.488894] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1183f2a6-0822-46e1-a6cf-3bfa672efea5 
{{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.495574] env[68964]: DEBUG oslo_vmware.api [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431790, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075026} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2153.497027] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2153.497111] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2153.497286] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2153.497461] env[68964]: INFO nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Took 0.60 seconds to destroy the instance on the hypervisor. 
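
The destroy sequence above shows oslo_vmware's asynchronous task pattern: each vCenter operation (UnregisterVM, DeleteDatastoreFile_Task) returns a task reference, and the caller blocks in wait_for_task while _poll_task logs progress until the task reaches a terminal state. A minimal sketch of that poll-until-done loop (illustrative only; the helper names and poll interval are assumptions, not oslo_vmware's actual API):

    import time

    POLL_INTERVAL = 0.5  # seconds between polls; assumed value

    def wait_for_task(get_task_info):
        # Poll a vCenter-style task until it succeeds or errors out,
        # mirroring the "progress is 0%" ... "completed successfully"
        # lines in the log above.
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 0}
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            time.sleep(POLL_INTERVAL)
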
[ 2153.499202] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-68e9e48d-0373-4a6b-9a63-c082a46f1879 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.500995] env[68964]: DEBUG nova.compute.claims [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2153.501176] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2153.501385] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2153.521108] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2153.659488] env[68964]: DEBUG oslo_vmware.rw_handles [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2153.715281] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16ee8303-3218-492b-89fd-c462040c696e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.720117] env[68964]: DEBUG oslo_vmware.rw_handles [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2153.720348] env[68964]: DEBUG oslo_vmware.rw_handles [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2153.724059] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266fc50c-4f3f-436f-bd2b-0b63944d811c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.754743] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa4a802a-dd34-4e07-82e0-7bea029a3d67 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.761747] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-435a107d-7c4d-4867-9d80-8ac68b6bf81a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.774906] env[68964]: DEBUG nova.compute.provider_tree [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2153.783007] env[68964]: DEBUG nova.scheduler.client.report [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2153.797418] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.296s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.797861] env[68964]: ERROR nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2153.797861] env[68964]: Faults: ['InvalidArgument'] [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] Traceback (most recent call last): [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self.driver.spawn(context, instance, image_meta, [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self._fetch_image_if_missing(context, vi) [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] image_cache(vi, tmp_image_ds_loc) [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] vm_util.copy_virtual_disk( [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] session._wait_for_task(vmdk_copy_task) [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] return self.wait_for_task(task_ref) [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] return evt.wait() [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] result = hub.switch() [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] return self.greenlet.switch() [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] self.f(*self.args, **self.kw) [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] raise exceptions.translate_fault(task_info.error) [ 
2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] Faults: ['InvalidArgument'] [ 2153.797861] env[68964]: ERROR nova.compute.manager [instance: 8f94d3c8-4674-463d-8829-68a184967183] [ 2153.798780] env[68964]: DEBUG nova.compute.utils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2153.800123] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Build of instance 8f94d3c8-4674-463d-8829-68a184967183 was re-scheduled: A specified parameter was not correct: fileType [ 2153.800123] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2153.800545] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2153.800719] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2153.800895] env[68964]: DEBUG nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2153.801136] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2154.131583] env[68964]: DEBUG nova.network.neutron [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2154.146622] env[68964]: INFO nova.compute.manager [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Took 0.35 seconds to deallocate network for instance. 
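
The traceback above is the failure path for instance 8f94d3c8: CopyVirtualDisk_Task rejects the fileType parameter with InvalidArgument while _cache_sparse_image copies the downloaded sparse VMDK into the image cache, so spawn() fails, the compute manager aborts the resource claim, deallocates the instance's networking, and re-schedules the build. A schematic of that cleanup-and-reschedule flow (a sketch only; the class and helper names are simplified stand-ins for Nova's actual code, not its real control flow):

    class RescheduledException(Exception):
        pass  # signals the conductor to retry the build on another host

    def build_and_run_instance(driver, claim, network, instance):
        try:
            driver.spawn(instance)  # raises VimFaultException in the log above
        except Exception as exc:
            claim.abort()                 # release the claimed compute_resources
            network.deallocate(instance)  # drop ports / instance_info_cache entries
            raise RescheduledException(str(exc))
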
[ 2154.237122] env[68964]: INFO nova.scheduler.client.report [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleted allocations for instance 8f94d3c8-4674-463d-8829-68a184967183 [ 2154.258041] env[68964]: DEBUG oslo_concurrency.lockutils [None req-7280ffa9-32ad-4213-ac0c-d38ee029f8ea tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8f94d3c8-4674-463d-8829-68a184967183" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 675.513s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.258332] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8f94d3c8-4674-463d-8829-68a184967183" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 479.861s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.258549] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "8f94d3c8-4674-463d-8829-68a184967183-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2154.258753] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8f94d3c8-4674-463d-8829-68a184967183-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.258943] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8f94d3c8-4674-463d-8829-68a184967183-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.261207] env[68964]: INFO nova.compute.manager [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Terminating instance [ 2154.263032] env[68964]: DEBUG nova.compute.manager [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Start destroying the instance on the hypervisor.
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2154.263194] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2154.263685] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-95f92831-b818-4f04-895e-43c79fe28485 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2154.273739] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cb8f4d4-5ceb-4c11-b299-e28b8cbb0a7b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2154.303150] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8f94d3c8-4674-463d-8829-68a184967183 could not be found. [ 2154.303368] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2154.303546] env[68964]: INFO nova.compute.manager [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2154.303780] env[68964]: DEBUG oslo.service.loopingcall [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2154.304017] env[68964]: DEBUG nova.compute.manager [-] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2154.304120] env[68964]: DEBUG nova.network.neutron [-] [instance: 8f94d3c8-4674-463d-8829-68a184967183] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2154.328078] env[68964]: DEBUG nova.network.neutron [-] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2154.335704] env[68964]: INFO nova.compute.manager [-] [instance: 8f94d3c8-4674-463d-8829-68a184967183] Took 0.03 seconds to deallocate network for instance.
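
The per-instance lock lines around this point come from oslo_concurrency's lockutils: the decorator's inner wrapper emits the "Acquiring lock ... / acquired ... waited Ns / released ... held Ns" DEBUG entries. A minimal usage sketch (the lock name reuses the instance UUID from the log; the function body is an illustrative assumption):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('8f94d3c8-4674-463d-8829-68a184967183')
    def do_terminate_instance():
        # Runs with the per-instance lock held; lockutils logs the
        # wait and hold durations seen in the surrounding entries.
        pass
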
[ 2154.427492] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e7aa5adb-562f-4406-9e4e-1b47733b1a6b tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "8f94d3c8-4674-463d-8829-68a184967183" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.169s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.428496] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8f94d3c8-4674-463d-8829-68a184967183" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 301.605s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.428747] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8f94d3c8-4674-463d-8829-68a184967183] During sync_power_state the instance has a pending task (deleting). Skip. [ 2154.428880] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "8f94d3c8-4674-463d-8829-68a184967183" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2180.733521] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2180.733826] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2180.733826] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2180.754141] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.754307] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.754435] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.754562] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building.
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.754690] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.754813] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.754933] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.755065] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.755187] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2180.755309] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2180.755852] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2180.756056] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2180.756200] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2181.724429] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2182.719607] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2182.739999] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2184.725191] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2185.725429] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2186.720659] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2186.724251] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2186.736988] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2186.737291] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2186.737374] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2186.737525] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 
2186.738728] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e330033-d9b2-4e95-8641-e755adb708dc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2186.747404] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9474e69-8e74-4257-821a-fe5fd5d8e6de {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2186.761088] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a5c3453-b1d8-4f4c-9408-2e8cf6b59aee {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2186.767010] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82db1ef5-3150-41a9-99af-c2e78d8878bc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2186.795489] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2186.795626] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2186.795809] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2186.860474] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.860633] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.860763] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.860887] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.861016] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.861138] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.861254] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.861369] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.861482] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dcd4de94-0433-416d-a9f6-c24f584a80ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2186.861661] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2186.861794] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2186.878276] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Refreshing inventories for resource provider 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2186.890683] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Updating ProviderTree inventory for provider 63b0294e-f555-48a6-a542-3466427066a9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2186.890858] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Updating inventory in ProviderTree for provider 63b0294e-f555-48a6-a542-3466427066a9 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2186.900448] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Refreshing aggregate associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, aggregates: None {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2186.916271] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Refreshing trait associations for resource provider 63b0294e-f555-48a6-a542-3466427066a9, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68964) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2187.012779] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-134caa72-fc56-4f14-89db-d6030c9a4d65 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.020320] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-225b5ea9-1cba-4eeb-9556-bf79bb787187 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.051062] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6964359c-daf8-4c79-a7ab-162c8b218732 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.057890] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61631785-ab84-4e21-acec-9f0a06cc9bd9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2187.070750] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2187.078899] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2187.093558] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2187.093740] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2202.387419] env[68964]: WARNING oslo_vmware.rw_handles [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2202.387419] 
env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2202.387419] env[68964]: ERROR oslo_vmware.rw_handles [ 2202.388067] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2202.389850] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2202.390102] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Copying Virtual Disk [datastore1] vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/d4f49c21-02fc-4020-a5ac-9d2298c57599/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2202.390387] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-81143224-e56f-4f0c-94ad-cb1f78b01db4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.398209] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Waiting for the task: (returnval){ [ 2202.398209] env[68964]: value = "task-3431791" [ 2202.398209] env[68964]: _type = "Task" [ 2202.398209] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.405665] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Task: {'id': task-3431791, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2202.908827] env[68964]: DEBUG oslo_vmware.exceptions [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Fault InvalidArgument not matched. 
{{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2202.909145] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2202.909719] env[68964]: ERROR nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2202.909719] env[68964]: Faults: ['InvalidArgument']
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Traceback (most recent call last):
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] yield resources
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self.driver.spawn(context, instance, image_meta,
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self._fetch_image_if_missing(context, vi)
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] image_cache(vi, tmp_image_ds_loc)
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] vm_util.copy_virtual_disk(
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] session._wait_for_task(vmdk_copy_task)
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] return self.wait_for_task(task_ref)
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] return evt.wait()
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] result = hub.switch()
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] return self.greenlet.switch()
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self.f(*self.args, **self.kw)
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] raise exceptions.translate_fault(task_info.error)
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Faults: ['InvalidArgument']
[ 2202.909719] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229]
[ 2202.910626] env[68964]: INFO nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Terminating instance
[ 2202.911594] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2202.911800] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2202.912055] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a4a7c96-50ae-49a6-a0bd-9af6a74f7e07 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2202.914453] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2202.914620] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2202.915348] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e34265-7464-4cf5-93dd-67be3cdfc595 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2202.922250] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2202.922470] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-de2e96f2-ad1b-446f-b950-f6afa1ce0590 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2202.924630] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2202.924798] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2202.925726] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bc716325-6cba-4cb4-b5d9-5d9bf00d2eb3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2202.930579] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){
[ 2202.930579] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5266d259-7d19-f836-bdb9-29d97f4f8826"
[ 2202.930579] env[68964]: _type = "Task"
[ 2202.930579] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2202.939580] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]5266d259-7d19-f836-bdb9-29d97f4f8826, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2202.988821] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2202.989073] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2202.989304] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Deleting the datastore file [datastore1] a8d43f08-4cf1-40aa-ad31-2b02b70d6229 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2202.989579] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e8febc9d-c367-4a00-8184-0bf152a26be7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2202.995977] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Waiting for the task: (returnval){
[ 2202.995977] env[68964]: value = "task-3431793"
[ 2202.995977] env[68964]: _type = "Task"
[ 2202.995977] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2203.003303] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Task: {'id': task-3431793, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2203.440415] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2203.440741] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating directory with path [datastore1] vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2203.440988] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aecc03d3-7022-4fa6-ac36-aad77d0e1b6a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.452159] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Created directory with path [datastore1] vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2203.452362] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Fetch image to [datastore1] vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2203.452530] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2203.453262] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ede535df-b52f-4a97-b718-f6edda93e518 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.460905] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6915ed28-a54a-4026-90db-f7f0cd755c11 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.469687] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-975c358f-8251-46d2-b21e-44459b95b06a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.502259] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8859c9-b956-4d06-b644-01a73a3937db {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.508769] env[68964]: DEBUG oslo_vmware.api [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Task: {'id': task-3431793, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075327} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2203.510152] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2203.510341] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2203.510513] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2203.510686] env[68964]: INFO nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2203.512693] env[68964]: DEBUG nova.compute.claims [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2203.512859] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2203.513089] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2203.515492] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-09e64ee9-2ec0-4305-acea-fafc007ead6a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.539123] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2203.657220] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2203.717110] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2203.717311] env[68964]: DEBUG oslo_vmware.rw_handles [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2203.765885] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb27d17-976a-4d5d-808e-1af73514fc6f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.773397] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54280e36-88fc-429e-866d-8b9a362d6eba {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.804127] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd426da7-732a-4bb5-8499-99d5085257a0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.810732] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-941ce532-ab33-4569-a554-20798e4c4c02 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2203.823610] env[68964]: DEBUG nova.compute.provider_tree [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2203.831755] env[68964]: DEBUG nova.scheduler.client.report [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2203.845718] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.333s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2203.846255] env[68964]: ERROR nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2203.846255] env[68964]: Faults: ['InvalidArgument']
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Traceback (most recent call last):
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self.driver.spawn(context, instance, image_meta,
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self._fetch_image_if_missing(context, vi)
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] image_cache(vi, tmp_image_ds_loc)
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] vm_util.copy_virtual_disk(
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] session._wait_for_task(vmdk_copy_task)
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] return self.wait_for_task(task_ref)
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] return evt.wait()
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] result = hub.switch()
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] return self.greenlet.switch()
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] self.f(*self.args, **self.kw)
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] raise exceptions.translate_fault(task_info.error)
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Faults: ['InvalidArgument']
[ 2203.846255] env[68964]: ERROR nova.compute.manager [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229]
[ 2203.847015] env[68964]: DEBUG nova.compute.utils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2203.848360] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Build of instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 was re-scheduled: A specified parameter was not correct: fileType
[ 2203.848360] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2203.848724] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2203.848908] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2203.849129] env[68964]: DEBUG nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2203.849306] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2204.237358] env[68964]: DEBUG nova.network.neutron [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2204.251520] env[68964]: INFO nova.compute.manager [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Took 0.40 seconds to deallocate network for instance.
[ 2204.340591] env[68964]: INFO nova.scheduler.client.report [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Deleted allocations for instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229
[ 2204.361204] env[68964]: DEBUG oslo_concurrency.lockutils [None req-79b09d08-c361-45f9-a614-683e9aa08774 tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 670.500s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2204.361379] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 474.368s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2204.361673] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Acquiring lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2204.361814] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2204.361974] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2204.364229] env[68964]: INFO nova.compute.manager [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Terminating instance
[ 2204.365995] env[68964]: DEBUG nova.compute.manager [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2204.366217] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2204.366692] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d8c3a9ae-062a-4560-afaf-97b8aadc96f9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2204.376022] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5d647f9-b608-4d0f-a239-d6e5fed6a8fb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2204.404642] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a8d43f08-4cf1-40aa-ad31-2b02b70d6229 could not be found.
[ 2204.404860] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2204.405049] env[68964]: INFO nova.compute.manager [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2204.405297] env[68964]: DEBUG oslo.service.loopingcall [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2204.405515] env[68964]: DEBUG nova.compute.manager [-] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2204.405611] env[68964]: DEBUG nova.network.neutron [-] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2204.430037] env[68964]: DEBUG nova.network.neutron [-] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2204.438056] env[68964]: INFO nova.compute.manager [-] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] Took 0.03 seconds to deallocate network for instance.
[ 2204.524743] env[68964]: DEBUG oslo_concurrency.lockutils [None req-8d05a69a-8431-4489-a547-05773ab5b0cd tempest-ServerPasswordTestJSON-321738188 tempest-ServerPasswordTestJSON-321738188-project-member] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2204.525623] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 351.702s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2204.525860] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a8d43f08-4cf1-40aa-ad31-2b02b70d6229] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2204.526120] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "a8d43f08-4cf1-40aa-ad31-2b02b70d6229" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2209.606711] env[68964]: DEBUG oslo_concurrency.lockutils [None req-acd29092-e60f-4f93-b7e8-fc0943c0bb29 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "be9830e6-1e07-443b-b08e-cefac29e2e5c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2241.094735] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2241.095162] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2241.095162] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2241.113039] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113198] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113330] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113452] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113583] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113695] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113809] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.113964] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2241.114057] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2241.114502] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2241.724554] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2241.724781] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2243.725579] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2243.725916] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2244.725554] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2246.719596] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2247.724578] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2247.724904] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2247.735736] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2247.735958] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2247.736145] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2247.736303] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2247.737482] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57b69e5d-e7dc-4053-849e-e8759435bf71 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.746240] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e0143f-7e97-4669-b59f-5e140647c362 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.759917] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f6d0c4a-a829-4d27-9fde-0cfa8abbb1a7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.766036] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf6ee94b-7d0a-4237-abee-41de199b8642 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.796243] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180902MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2247.796357] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2247.796553] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2247.856662] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.856829] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.856961] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.857101] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.857225] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.857342] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.857469] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.857587] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dcd4de94-0433-416d-a9f6-c24f584a80ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2247.857768] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2247.857904] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2247.952184] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62b9b53b-8429-497c-923b-f05657e830c1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.959692] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5457d1b-d004-4fd2-a648-d90307940688 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.989101] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-add8476f-f2f5-42f6-a640-0db66a26ca9a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2247.995612] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91db7d5d-8541-44d3-b836-46fd8566eddc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2248.008082] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2248.016170] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2248.031255] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2248.031436] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2253.447721] env[68964]: WARNING oslo_vmware.rw_handles [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2253.447721] env[68964]: ERROR oslo_vmware.rw_handles
[ 2253.448313] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2253.450140] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2253.450381] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Copying Virtual Disk [datastore1] vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/77210aef-b273-46fc-99cc-cf654f4921fc/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2253.450669] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-36814958-67b5-4a86-aa4a-376c48c1b1be {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2253.458276] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){
[ 2253.458276] env[68964]: value = "task-3431794"
[ 2253.458276] env[68964]: _type = "Task"
[ 2253.458276] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2253.467582] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': task-3431794, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2253.971165] env[68964]: DEBUG oslo_vmware.exceptions [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2253.971544] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2253.972453] env[68964]: ERROR nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2253.972453] env[68964]: Faults: ['InvalidArgument']
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Traceback (most recent call last):
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] yield resources
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self.driver.spawn(context, instance, image_meta,
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self._fetch_image_if_missing(context, vi)
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] image_cache(vi, tmp_image_ds_loc)
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] vm_util.copy_virtual_disk(
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] session._wait_for_task(vmdk_copy_task)
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] return self.wait_for_task(task_ref)
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] return evt.wait()
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] result = hub.switch()
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] return self.greenlet.switch()
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self.f(*self.args, **self.kw)
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] raise exceptions.translate_fault(task_info.error)
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Faults: ['InvalidArgument']
[ 2253.972453] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8]
[ 2253.973297] env[68964]: INFO nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Terminating instance
[ 2253.975007] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2253.975378] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2253.975615] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6fe7eda4-0dee-4fd6-a788-db3fcb6308ae {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2253.978761] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2253.979102] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2253.980092] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b4ce973-bf0d-410f-b615-b669e3f9c810 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2253.989537] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2253.990642] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-39bdfd84-08a9-4842-899f-df1f41684334 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2253.992249] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2253.992624] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Folder [datastore1] devstack-image-cache_base created.
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2253.993096] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d84a8bfe-2464-4ea8-8a00-65d6a0ad794c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.998811] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 2253.998811] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522ea9e4-156c-68eb-f640-e4209c8c2350" [ 2253.998811] env[68964]: _type = "Task" [ 2253.998811] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2254.005407] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]522ea9e4-156c-68eb-f640-e4209c8c2350, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2254.058400] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2254.058613] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2254.058787] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Deleting the datastore file [datastore1] f94037f2-5dea-4824-9f2d-0f87684ccdb8 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2254.059059] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-50ea2fe9-269c-4e15-af0b-91adb123b8a2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.064963] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for the task: (returnval){ [ 2254.064963] env[68964]: value = "task-3431796" [ 2254.064963] env[68964]: _type = "Task" [ 2254.064963] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2254.072248] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': task-3431796, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2254.508664] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2254.509073] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating directory with path [datastore1] vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2254.509175] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ecf7e72a-668f-4809-b286-afe9a509626a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.520411] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Created directory with path [datastore1] vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2254.520620] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Fetch image to [datastore1] vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2254.520786] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2254.521587] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30376f1f-ff88-4eb0-b34e-2948db6144ee {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.528155] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d32ff88e-e4fa-4412-9b9a-167989f25886 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.537226] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee98241f-2ff8-41d7-a6bb-e5cda2490a28 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.578103] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b75f5a4-9591-49de-ae5f-1c0da4a0d569 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.586532] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-95747bd2-856c-4397-8f34-8372faf45ba6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.588191] env[68964]: DEBUG oslo_vmware.api [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Task: {'id': task-3431796, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085954} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2254.588434] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2254.588612] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2254.588777] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2254.588944] env[68964]: INFO nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Took 0.61 seconds to destroy the instance on the hypervisor. 
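The polling chain visible in the traceback and task entries above (wait_for_task -> evt.wait() -> loopingcall -> _poll_task -> translate_fault) reduces to one pattern: poll the task's state until it succeeds or errors, surfacing VMware faults such as the InvalidArgument fileType failure as Python exceptions. A minimal, self-contained sketch of that pattern follows; it is not the real oslo.vmware implementation, and get_task_info/TaskFaultError are hypothetical stand-ins:

    import time

    class TaskFaultError(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        # Poll until the task leaves the queued/running states, mirroring
        # the "progress is 0%" lines emitted by _poll_task in the log.
        while True:
            info = get_task_info(task_ref)
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # In the real library, translate_fault() turns the task
                # error into VimFaultException ("A specified parameter
                # was not correct: fileType" above).
                raise TaskFaultError(info.get("error", "task failed"))
            print(f"Task {task_ref}: progress is {info.get('progress', 0)}%")
            time.sleep(interval)

In the log the same loop is driven by an eventlet looping call instead of time.sleep, which is why the traceback passes through hub.switch() before reaching _poll_task.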
[ 2254.591037] env[68964]: DEBUG nova.compute.claims [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2254.591206] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2254.591409] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2254.612144] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2254.662711] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2254.722554] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2254.722740] env[68964]: DEBUG oslo_vmware.rw_handles [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2254.780201] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e51563b6-3aa5-4657-a6fe-bedf51b5077d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.787610] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7da8d21-4c4d-4dd2-8f9a-16d4e141fd5b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.817625] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0df2c381-f838-4915-b141-554194863d34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.824368] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f7779cd-f4ca-4ae6-9b5a-730ae46185f6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.836923] env[68964]: DEBUG nova.compute.provider_tree [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2254.846777] env[68964]: DEBUG nova.scheduler.client.report [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2254.859458] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.268s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2254.859971] env[68964]: ERROR nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2254.859971] env[68964]: Faults: ['InvalidArgument'] [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Traceback (most recent call last): [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self.driver.spawn(context, instance, image_meta, [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self._fetch_image_if_missing(context, vi) [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] image_cache(vi, tmp_image_ds_loc) [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] vm_util.copy_virtual_disk( [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] session._wait_for_task(vmdk_copy_task) [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] return self.wait_for_task(task_ref) [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] return evt.wait() [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] result = hub.switch() [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] return self.greenlet.switch() [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] self.f(*self.args, **self.kw) [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] 
File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] raise exceptions.translate_fault(task_info.error) [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Faults: ['InvalidArgument'] [ 2254.859971] env[68964]: ERROR nova.compute.manager [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] [ 2254.860742] env[68964]: DEBUG nova.compute.utils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2254.861909] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Build of instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 was re-scheduled: A specified parameter was not correct: fileType [ 2254.861909] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2254.862296] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2254.862468] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2254.862635] env[68964]: DEBUG nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2254.862812] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2255.206790] env[68964]: DEBUG nova.network.neutron [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2255.220784] env[68964]: INFO nova.compute.manager [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Took 0.36 seconds to deallocate network for instance. [ 2255.315217] env[68964]: INFO nova.scheduler.client.report [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Deleted allocations for instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 [ 2255.337684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-5a6843a1-36ee-4049-acb1-1eee781a4e9d tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 640.492s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.337684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 444.229s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.337684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Acquiring lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2255.337684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.337684] env[68964]: DEBUG oslo_concurrency.lockutils [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.339326] env[68964]: INFO nova.compute.manager [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Terminating instance [ 2255.341656] env[68964]: DEBUG nova.compute.manager [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2255.341860] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2255.342377] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2661f75b-a380-4287-b13d-6957cc4e2053 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.351466] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-736ddb09-d110-4ce9-9606-aee04ea983c6 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2255.378598] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f94037f2-5dea-4824-9f2d-0f87684ccdb8 could not be found. [ 2255.378803] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2255.378972] env[68964]: INFO nova.compute.manager [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2255.379273] env[68964]: DEBUG oslo.service.loopingcall [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2255.379496] env[68964]: DEBUG nova.compute.manager [-] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2255.379592] env[68964]: DEBUG nova.network.neutron [-] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2255.409711] env[68964]: DEBUG nova.network.neutron [-] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2255.420109] env[68964]: INFO nova.compute.manager [-] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] Took 0.04 seconds to deallocate network for instance. [ 2255.506108] env[68964]: DEBUG oslo_concurrency.lockutils [None req-23ea0956-c2d0-481d-9749-32006f6a09d8 tempest-AttachVolumeShelveTestJSON-262628643 tempest-AttachVolumeShelveTestJSON-262628643-project-member] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2255.506915] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 402.683s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2255.507117] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f94037f2-5dea-4824-9f2d-0f87684ccdb8] During sync_power_state the instance has a pending task (deleting). Skip. 
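The acquired/released bookkeeping around the "f94037f2...-events" and power-state sync locks follows oslo.concurrency's lockutils pattern: every critical section reports how long it waited for the lock and how long it held it. A simplified, toy sketch of that decorator (not the actual lockutils code):

    import functools
    import threading
    import time

    _locks: dict[str, threading.Lock] = {}

    def synchronized(name):
        # Toy equivalent of oslo_concurrency.lockutils.synchronized: time
        # the wait for the lock and the time spent holding it, like the
        # "waited 0.000s" / "held 0.169s" lines in the log.
        lock = _locks.setdefault(name, threading.Lock())
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                t0 = time.monotonic()
                with lock:
                    print(f'Lock "{name}" acquired by "{fn.__qualname__}" '
                          f':: waited {time.monotonic() - t0:.3f}s')
                    t1 = time.monotonic()
                    try:
                        return fn(*args, **kwargs)
                    finally:
                        print(f'Lock "{name}" "released" by "{fn.__qualname__}" '
                              f':: held {time.monotonic() - t1:.3f}s')
            return inner
        return wrap

    @synchronized("compute_resources")
    def abort_instance_claim():
        pass  # critical section runs with the named lock held

Serializing on a named lock such as "compute_resources" is what keeps the resource tracker's claim and abort operations from interleaving, as the alternating acquired/released lines in this section illustrate.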
[ 2255.507299] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "f94037f2-5dea-4824-9f2d-0f87684ccdb8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2256.195165] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "96d01266-87ae-4bb5-a047-c81dd74c0f24" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2256.195476] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "96d01266-87ae-4bb5-a047-c81dd74c0f24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2256.206350] env[68964]: DEBUG nova.compute.manager [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Starting instance... {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2256.254611] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2256.254865] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2256.256383] env[68964]: INFO nova.compute.claims [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2256.394926] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f148a51-bba4-4c13-9d15-e821ed2a7456 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.403971] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80dd467a-ebfa-4874-8974-71471a810264 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.434702] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24cd83ba-3346-4b36-bb5b-c60f95b44040 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.442292] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8b68115-4897-4a9b-9ec1-d6300204fb77 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.456987] env[68964]: DEBUG nova.compute.provider_tree [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2256.466435] env[68964]: DEBUG nova.scheduler.client.report [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2256.483493] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2256.484082] env[68964]: DEBUG nova.compute.manager [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2256.522947] env[68964]: DEBUG nova.compute.utils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2256.524182] env[68964]: DEBUG nova.compute.manager [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2256.524363] env[68964]: DEBUG nova.network.neutron [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2256.532480] env[68964]: DEBUG nova.compute.manager [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Start building block device mappings for instance. 
{{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2256.585226] env[68964]: DEBUG nova.policy [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae5a60881ac14c52b769561e6f81d6ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6087614d846942ddbd06308568d3f1d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2256.596795] env[68964]: DEBUG nova.compute.manager [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Start spawning the instance on the hypervisor. {{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2256.620447] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2256.620695] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2256.620849] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2256.621034] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2256.621186] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2256.621328] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 
tempest-ServersTestJSON-1379046078-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2256.621530] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2256.621685] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2256.622017] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2256.622127] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2256.622362] env[68964]: DEBUG nova.virt.hardware [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2256.623233] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c44df3c2-af7f-460c-aafa-90cd6a62f272 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.631175] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-134f877e-2dd4-4570-a5c9-763574fa766a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2257.078502] env[68964]: DEBUG nova.network.neutron [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Successfully created port: d8282ccf-8cb7-4343-9c56-031de9d7494d {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2257.667618] env[68964]: DEBUG nova.compute.manager [req-73dc7c02-1602-44d8-ba60-2478c71f622f req-52dfa766-c32d-4aa4-b4fe-68dac4b713ae service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Received event network-vif-plugged-d8282ccf-8cb7-4343-9c56-031de9d7494d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2257.667872] env[68964]: DEBUG oslo_concurrency.lockutils [req-73dc7c02-1602-44d8-ba60-2478c71f622f req-52dfa766-c32d-4aa4-b4fe-68dac4b713ae service nova] Acquiring lock "96d01266-87ae-4bb5-a047-c81dd74c0f24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2257.668158] env[68964]: DEBUG oslo_concurrency.lockutils [req-73dc7c02-1602-44d8-ba60-2478c71f622f req-52dfa766-c32d-4aa4-b4fe-68dac4b713ae service nova] Lock "96d01266-87ae-4bb5-a047-c81dd74c0f24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2257.668232] env[68964]: DEBUG oslo_concurrency.lockutils [req-73dc7c02-1602-44d8-ba60-2478c71f622f req-52dfa766-c32d-4aa4-b4fe-68dac4b713ae service nova] Lock "96d01266-87ae-4bb5-a047-c81dd74c0f24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2257.668399] env[68964]: DEBUG nova.compute.manager [req-73dc7c02-1602-44d8-ba60-2478c71f622f req-52dfa766-c32d-4aa4-b4fe-68dac4b713ae service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] No waiting events found dispatching network-vif-plugged-d8282ccf-8cb7-4343-9c56-031de9d7494d {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2257.668554] env[68964]: WARNING nova.compute.manager [req-73dc7c02-1602-44d8-ba60-2478c71f622f req-52dfa766-c32d-4aa4-b4fe-68dac4b713ae service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Received unexpected event network-vif-plugged-d8282ccf-8cb7-4343-9c56-031de9d7494d for instance with vm_state building and task_state spawning. [ 2257.756552] env[68964]: DEBUG nova.network.neutron [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Successfully updated port: d8282ccf-8cb7-4343-9c56-031de9d7494d {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2257.768485] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "refresh_cache-96d01266-87ae-4bb5-a047-c81dd74c0f24" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2257.768631] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "refresh_cache-96d01266-87ae-4bb5-a047-c81dd74c0f24" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2257.768781] env[68964]: DEBUG nova.network.neutron [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2257.805181] env[68964]: DEBUG nova.network.neutron [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2257.962493] env[68964]: DEBUG nova.network.neutron [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Updating instance_info_cache with network_info: [{"id": "d8282ccf-8cb7-4343-9c56-031de9d7494d", "address": "fa:16:3e:c4:90:a5", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8282ccf-8c", "ovs_interfaceid": "d8282ccf-8cb7-4343-9c56-031de9d7494d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2257.974524] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "refresh_cache-96d01266-87ae-4bb5-a047-c81dd74c0f24" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2257.974806] env[68964]: DEBUG nova.compute.manager [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Instance network_info: |[{"id": "d8282ccf-8cb7-4343-9c56-031de9d7494d", "address": "fa:16:3e:c4:90:a5", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8282ccf-8c", "ovs_interfaceid": "d8282ccf-8cb7-4343-9c56-031de9d7494d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2257.975216] env[68964]: 
DEBUG nova.virt.vmwareapi.vmops [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c4:90:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ec763be6-4041-4651-8fd7-3820cf0ab86d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd8282ccf-8cb7-4343-9c56-031de9d7494d', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2257.982674] env[68964]: DEBUG oslo.service.loopingcall [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2257.983144] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2257.983374] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d788e3a3-0c1e-4958-9c9c-2e3d7a853764 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2258.003670] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2258.003670] env[68964]: value = "task-3431797" [ 2258.003670] env[68964]: _type = "Task" [ 2258.003670] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2258.011476] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431797, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2258.513798] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431797, 'name': CreateVM_Task, 'duration_secs': 0.465384} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2258.513959] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2258.514637] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2258.514802] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2258.515137] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2258.515388] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4a096295-1676-4e16-8646-24b58cae0235 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2258.519669] env[68964]: DEBUG oslo_vmware.api [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 2258.519669] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52f3a8c5-74b3-667a-1350-2bc964f3467b" [ 2258.519669] env[68964]: _type = "Task" [ 2258.519669] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2258.527214] env[68964]: DEBUG oslo_vmware.api [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52f3a8c5-74b3-667a-1350-2bc964f3467b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2259.030394] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2259.030697] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2259.030865] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a9da2d3d-56c8-4ee7-8ff9-c7eb0bc74e9a tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2259.694347] env[68964]: DEBUG nova.compute.manager [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Received event network-changed-d8282ccf-8cb7-4343-9c56-031de9d7494d {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2259.695010] env[68964]: DEBUG nova.compute.manager [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Refreshing instance network info cache due to event network-changed-d8282ccf-8cb7-4343-9c56-031de9d7494d. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2259.695010] env[68964]: DEBUG oslo_concurrency.lockutils [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] Acquiring lock "refresh_cache-96d01266-87ae-4bb5-a047-c81dd74c0f24" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2259.695010] env[68964]: DEBUG oslo_concurrency.lockutils [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] Acquired lock "refresh_cache-96d01266-87ae-4bb5-a047-c81dd74c0f24" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2259.695151] env[68964]: DEBUG nova.network.neutron [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Refreshing network info cache for port d8282ccf-8cb7-4343-9c56-031de9d7494d {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2259.937486] env[68964]: DEBUG nova.network.neutron [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Updated VIF entry in instance network info cache for port d8282ccf-8cb7-4343-9c56-031de9d7494d. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2259.938092] env[68964]: DEBUG nova.network.neutron [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Updating instance_info_cache with network_info: [{"id": "d8282ccf-8cb7-4343-9c56-031de9d7494d", "address": "fa:16:3e:c4:90:a5", "network": {"id": "16e206f1-9c4c-4ce3-bd55-aeda5cfaeee7", "bridge": "br-int", "label": "tempest-ServersTestJSON-94782596-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6087614d846942ddbd06308568d3f1d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ec763be6-4041-4651-8fd7-3820cf0ab86d", "external-id": "nsx-vlan-transportzone-943", "segmentation_id": 943, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8282ccf-8c", "ovs_interfaceid": "d8282ccf-8cb7-4343-9c56-031de9d7494d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2259.948293] env[68964]: DEBUG oslo_concurrency.lockutils [req-d0a01d8f-67a0-4d92-9502-0f5f6d8a0294 req-b8e0a4dc-e230-4fcb-92d4-58643e3c9909 service nova] Releasing lock "refresh_cache-96d01266-87ae-4bb5-a047-c81dd74c0f24" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2269.385277] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquiring lock "8d45312a-5084-40b7-b4f7-733a2285bb4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2269.385277] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Lock "8d45312a-5084-40b7-b4f7-733a2285bb4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2269.394954] env[68964]: DEBUG nova.compute.manager [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Starting instance...
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2269.442596] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2269.444805] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2269.445401] env[68964]: INFO nova.compute.claims [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2269.643767] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ad5289-1e1f-4358-ac14-248c85461449 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2269.651577] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e44e80e0-0371-45eb-a7b2-727aef00d561 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2269.681212] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e70fa84-8123-4f8e-a21a-3f79a304d5b1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2269.688480] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c87fc6c-549b-4a35-9c74-e4879f48bf00 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2269.701530] env[68964]: DEBUG nova.compute.provider_tree [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2269.710082] env[68964]: DEBUG nova.scheduler.client.report [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2269.724987] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2269.725509] env[68964]: DEBUG nova.compute.manager [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2269.758995] env[68964]: DEBUG nova.compute.utils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2269.761597] env[68964]: DEBUG nova.compute.manager [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2269.761597] env[68964]: DEBUG nova.network.neutron [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2269.770940] env[68964]: DEBUG nova.compute.manager [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2269.818671] env[68964]: DEBUG nova.policy [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '24d958d10a464faeae86084b1cc73874', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '91171b16d18941b9af8a011056f80724', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2269.832333] env[68964]: DEBUG nova.compute.manager [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2269.858537] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2269.858774] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2269.858931] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2269.859157] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2269.859313] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2269.859459] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2269.859663] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2269.859819] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Build topologies for 1 vcpu(s) 1:1:1
{{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2269.859981] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2269.860154] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2269.860323] env[68964]: DEBUG nova.virt.hardware [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2269.861185] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8b41e46-5b9e-4e58-a5e1-32c7f0d0722b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2269.868984] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f05db0c-9cbb-4363-9784-e6ece43e7df4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2270.186095] env[68964]: DEBUG nova.network.neutron [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Successfully created port: 9ff39378-d252-4538-9904-d7c960774741 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2270.765180] env[68964]: DEBUG nova.compute.manager [req-3fcb3ba6-5ad6-418b-a521-3be9c39b8407 req-979479b5-1b53-4d13-8e19-766d8fdb1663 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Received event network-vif-plugged-9ff39378-d252-4538-9904-d7c960774741 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2270.765444] env[68964]: DEBUG oslo_concurrency.lockutils [req-3fcb3ba6-5ad6-418b-a521-3be9c39b8407 req-979479b5-1b53-4d13-8e19-766d8fdb1663 service nova] Acquiring lock "8d45312a-5084-40b7-b4f7-733a2285bb4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2270.765605] env[68964]: DEBUG oslo_concurrency.lockutils [req-3fcb3ba6-5ad6-418b-a521-3be9c39b8407 req-979479b5-1b53-4d13-8e19-766d8fdb1663 service nova] Lock "8d45312a-5084-40b7-b4f7-733a2285bb4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2270.765771] env[68964]: DEBUG oslo_concurrency.lockutils [req-3fcb3ba6-5ad6-418b-a521-3be9c39b8407 req-979479b5-1b53-4d13-8e19-766d8fdb1663 service nova] Lock "8d45312a-5084-40b7-b4f7-733a2285bb4d-events" "released" by
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2270.765974] env[68964]: DEBUG nova.compute.manager [req-3fcb3ba6-5ad6-418b-a521-3be9c39b8407 req-979479b5-1b53-4d13-8e19-766d8fdb1663 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] No waiting events found dispatching network-vif-plugged-9ff39378-d252-4538-9904-d7c960774741 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2270.770290] env[68964]: WARNING nova.compute.manager [req-3fcb3ba6-5ad6-418b-a521-3be9c39b8407 req-979479b5-1b53-4d13-8e19-766d8fdb1663 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Received unexpected event network-vif-plugged-9ff39378-d252-4538-9904-d7c960774741 for instance with vm_state building and task_state spawning. [ 2270.893689] env[68964]: DEBUG nova.network.neutron [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Successfully updated port: 9ff39378-d252-4538-9904-d7c960774741 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2270.908236] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquiring lock "refresh_cache-8d45312a-5084-40b7-b4f7-733a2285bb4d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2270.908452] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquired lock "refresh_cache-8d45312a-5084-40b7-b4f7-733a2285bb4d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2270.908614] env[68964]: DEBUG nova.network.neutron [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2270.967235] env[68964]: DEBUG nova.network.neutron [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2271.121089] env[68964]: DEBUG nova.network.neutron [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Updating instance_info_cache with network_info: [{"id": "9ff39378-d252-4538-9904-d7c960774741", "address": "fa:16:3e:c9:bc:82", "network": {"id": "e9ba088c-d79a-4713-9ac7-18786867e884", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-72095980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "91171b16d18941b9af8a011056f80724", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d182e8eb-3f6d-4c76-a06e-133dd9b3cd30", "external-id": "nsx-vlan-transportzone-260", "segmentation_id": 260, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9ff39378-d2", "ovs_interfaceid": "9ff39378-d252-4538-9904-d7c960774741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2271.131791] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Releasing lock "refresh_cache-8d45312a-5084-40b7-b4f7-733a2285bb4d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2271.132077] env[68964]: DEBUG nova.compute.manager [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Instance network_info: |[{"id": "9ff39378-d252-4538-9904-d7c960774741", "address": "fa:16:3e:c9:bc:82", "network": {"id": "e9ba088c-d79a-4713-9ac7-18786867e884", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-72095980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "91171b16d18941b9af8a011056f80724", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d182e8eb-3f6d-4c76-a06e-133dd9b3cd30", "external-id": "nsx-vlan-transportzone-260", "segmentation_id": 260, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9ff39378-d2", "ovs_interfaceid": "9ff39378-d252-4538-9904-d7c960774741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2271.132450] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c9:bc:82', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd182e8eb-3f6d-4c76-a06e-133dd9b3cd30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9ff39378-d252-4538-9904-d7c960774741', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2271.139819] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Creating folder: Project (91171b16d18941b9af8a011056f80724). Parent ref: group-v684465. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2271.140319] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9a95e62b-009c-4d25-bdd6-409aa31c4e0f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2271.153507] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Created folder: Project (91171b16d18941b9af8a011056f80724) in parent group-v684465. [ 2271.153694] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Creating folder: Instances. Parent ref: group-v684613. {{(pid=68964) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2271.153935] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2bbf26d-3141-481d-8613-0750f82d07ea {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2271.162756] env[68964]: INFO nova.virt.vmwareapi.vm_util [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Created folder: Instances in parent group-v684613. [ 2271.162975] env[68964]: DEBUG oslo.service.loopingcall [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2271.163166] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2271.163352] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8618b149-d8fa-46dc-b91d-76ecd6e0400b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2271.184048] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2271.184048] env[68964]: value = "task-3431800" [ 2271.184048] env[68964]: _type = "Task" [ 2271.184048] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2271.190677] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431800, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2271.692306] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431800, 'name': CreateVM_Task} progress is 99%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2272.193625] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431800, 'name': CreateVM_Task, 'duration_secs': 0.518167} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2272.193928] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2272.194492] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2272.194735] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2272.195059] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2272.195361] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1e7f1df9-42e9-46f2-ba59-147f35dcb1ca {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2272.199857] env[68964]: DEBUG oslo_vmware.api [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 
tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Waiting for the task: (returnval){ [ 2272.199857] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a9b04e-e102-f799-0cf0-0ffee74b6630" [ 2272.199857] env[68964]: _type = "Task" [ 2272.199857] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2272.207455] env[68964]: DEBUG oslo_vmware.api [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a9b04e-e102-f799-0cf0-0ffee74b6630, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2272.710350] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2272.711564] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2272.711564] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9e50b5da-08f7-4929-8079-a1c467658bc3 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2272.797252] env[68964]: DEBUG nova.compute.manager [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Received event network-changed-9ff39378-d252-4538-9904-d7c960774741 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2272.797463] env[68964]: DEBUG nova.compute.manager [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Refreshing instance network info cache due to event network-changed-9ff39378-d252-4538-9904-d7c960774741. 
{{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2272.797634] env[68964]: DEBUG oslo_concurrency.lockutils [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] Acquiring lock "refresh_cache-8d45312a-5084-40b7-b4f7-733a2285bb4d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2272.797778] env[68964]: DEBUG oslo_concurrency.lockutils [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] Acquired lock "refresh_cache-8d45312a-5084-40b7-b4f7-733a2285bb4d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2272.797934] env[68964]: DEBUG nova.network.neutron [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Refreshing network info cache for port 9ff39378-d252-4538-9904-d7c960774741 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2273.069780] env[68964]: DEBUG nova.network.neutron [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Updated VIF entry in instance network info cache for port 9ff39378-d252-4538-9904-d7c960774741. {{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2273.070149] env[68964]: DEBUG nova.network.neutron [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Updating instance_info_cache with network_info: [{"id": "9ff39378-d252-4538-9904-d7c960774741", "address": "fa:16:3e:c9:bc:82", "network": {"id": "e9ba088c-d79a-4713-9ac7-18786867e884", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-72095980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "91171b16d18941b9af8a011056f80724", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d182e8eb-3f6d-4c76-a06e-133dd9b3cd30", "external-id": "nsx-vlan-transportzone-260", "segmentation_id": 260, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9ff39378-d2", "ovs_interfaceid": "9ff39378-d252-4538-9904-d7c960774741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2273.079576] env[68964]: DEBUG oslo_concurrency.lockutils [req-dcf7852a-72a4-4217-a35c-a02ed588fe73 req-03c6b7c7-18fc-40fc-9736-4151bf0e44d7 service nova] Releasing lock "refresh_cache-8d45312a-5084-40b7-b4f7-733a2285bb4d" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2278.853122] env[68964]: DEBUG oslo_concurrency.lockutils [None req-e5e6a76a-9c8f-4550-9231-08142345a5d1 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] 
Acquiring lock "f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2294.771125] env[68964]: DEBUG oslo_concurrency.lockutils [None req-1f950ca0-e9cd-48ad-af78-90a904824ec5 tempest-AttachVolumeNegativeTest-415111208 tempest-AttachVolumeNegativeTest-415111208-project-member] Acquiring lock "dcd4de94-0433-416d-a9f6-c24f584a80ad" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2300.441501] env[68964]: WARNING oslo_vmware.rw_handles [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2300.441501] env[68964]: ERROR oslo_vmware.rw_handles [ 2300.442234] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2300.444094] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2300.444353] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Copying Virtual Disk [datastore1] vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] 
vmware_temp/c57fd174-0beb-43b1-932a-11b5ec9da411/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2300.444638] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fe2b8013-2cdf-4dfe-925e-403bc1762dc5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2300.452613] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 2300.452613] env[68964]: value = "task-3431801" [ 2300.452613] env[68964]: _type = "Task" [ 2300.452613] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2300.460722] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431801, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2300.963194] env[68964]: DEBUG oslo_vmware.exceptions [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2300.963194] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2300.963798] env[68964]: ERROR nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2300.963798] env[68964]: Faults: ['InvalidArgument'] [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Traceback (most recent call last): [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] yield resources [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self.driver.spawn(context, instance, image_meta, [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: 
eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self._fetch_image_if_missing(context, vi)
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] image_cache(vi, tmp_image_ds_loc)
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] vm_util.copy_virtual_disk(
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] session._wait_for_task(vmdk_copy_task)
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] return self.wait_for_task(task_ref)
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] return evt.wait()
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] result = hub.switch()
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] return self.greenlet.switch()
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self.f(*self.args, **self.kw)
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] raise exceptions.translate_fault(task_info.error)
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Faults: ['InvalidArgument']
[ 2300.963798] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70]
[ 2300.965027] env[68964]: INFO nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Terminating instance
[ 2300.965675] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2300.965884] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2300.966466] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-03ef3ac7-2714-4cae-8f9a-a0f840c89198 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2300.968378] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2300.968575] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2300.969323] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97ee8d33-608b-4902-9bb2-02f8fb200a07 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2300.977361] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2300.977587] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d0e61740-1334-432f-9f7f-267a096f0dc5 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2300.979837] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2300.980015] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2300.980988] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-522b3804-7b96-4fc9-8545-4c05bfba95b8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2300.985866] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Waiting for the task: (returnval){
[ 2300.985866] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52401888-25ec-5c18-36f4-2cf30407796f"
[ 2300.985866] env[68964]: _type = "Task"
[ 2300.985866] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2300.993584] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52401888-25ec-5c18-36f4-2cf30407796f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2301.031104] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2301.042200] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2301.042421] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2301.042600] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleting the datastore file [datastore1] eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2301.042861] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b63042b4-33be-4caa-8d42-fd380710fb4b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.049111] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){
[ 2301.049111] env[68964]: value = "task-3431803"
[ 2301.049111] env[68964]: _type = "Task"
[ 2301.049111] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2301.057236] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431803, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2301.496309] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2301.496575] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Creating directory with path [datastore1] vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2301.496799] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9f4177c6-60c6-4a77-a62b-000704eae4c9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.508093] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Created directory with path [datastore1] vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2301.508306] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Fetch image to [datastore1] vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2301.508481] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2301.509204] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaab411a-5f87-42dd-bf83-54047a17bf9d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.515437] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c82a5268-989a-4ff5-ad79-ac69dcf265d4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.524308] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a04ee6d-52b8-4399-a31d-3136ab2516cb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.558465] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b374bb-29b5-4e86-862c-0d6d96be7362 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.565018] env[68964]: DEBUG oslo_vmware.api [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': task-3431803, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072404} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2301.566381] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2301.566571] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2301.566743] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2301.566915] env[68964]: INFO nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2301.568676] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2e52ae5b-e833-4642-b445-fac7d5f0afc0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.570505] env[68964]: DEBUG nova.compute.claims [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2301.570675] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2301.570882] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2301.595584] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2301.723931] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2301.724112] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2301.724166] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2301.742048] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742221] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742354] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742487] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742616] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742714] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742832] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.742948] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2301.743222] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2301.743744] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2301.743889] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2301.746787] env[68964]: DEBUG oslo_vmware.rw_handles [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2301.806441] env[68964]: DEBUG oslo_vmware.rw_handles [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2301.806631] env[68964]: DEBUG oslo_vmware.rw_handles [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2301.848070] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a54ac8b-b60f-48e5-9f55-ae480963debc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.855676] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-045f903f-4147-4ec1-bb13-21b2d05815bd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.885046] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af871a5a-658f-49a3-8f71-ec878d21df65 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.891981] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-212bca89-39e4-4dac-8ce2-930397259a8e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.905064] env[68964]: DEBUG nova.compute.provider_tree [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2301.913678] env[68964]: DEBUG nova.scheduler.client.report [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2301.927243] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.356s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2301.927770] env[68964]: ERROR nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2301.927770] env[68964]: Faults: ['InvalidArgument']
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Traceback (most recent call last):
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self.driver.spawn(context, instance, image_meta,
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self._fetch_image_if_missing(context, vi)
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] image_cache(vi, tmp_image_ds_loc)
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] vm_util.copy_virtual_disk(
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] session._wait_for_task(vmdk_copy_task)
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] return self.wait_for_task(task_ref)
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] return evt.wait()
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] result = hub.switch()
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] return self.greenlet.switch()
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] self.f(*self.args, **self.kw)
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] raise exceptions.translate_fault(task_info.error)
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Faults: ['InvalidArgument']
[ 2301.927770] env[68964]: ERROR nova.compute.manager [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70]
[ 2301.929119] env[68964]: DEBUG nova.compute.utils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2301.930192] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Build of instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 was re-scheduled: A specified parameter was not correct: fileType
[ 2301.930192] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2301.930578] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2301.930747] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2301.930914] env[68964]: DEBUG nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2301.931089] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2302.272374] env[68964]: DEBUG nova.network.neutron [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2302.283016] env[68964]: INFO nova.compute.manager [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Took 0.35 seconds to deallocate network for instance.
[ 2302.371227] env[68964]: INFO nova.scheduler.client.report [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Deleted allocations for instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70
[ 2302.395078] env[68964]: DEBUG oslo_concurrency.lockutils [None req-9fb66764-8b9f-49a8-8b74-97c034d98856 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 612.605s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2302.395216] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 449.571s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2302.395433] env[68964]: INFO nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] During sync_power_state the instance has a pending task (spawning). Skip.
[ 2302.395646] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2302.396167] env[68964]: DEBUG oslo_concurrency.lockutils [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 416.244s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2302.396412] env[68964]: DEBUG oslo_concurrency.lockutils [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2302.396621] env[68964]: DEBUG oslo_concurrency.lockutils [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2302.396784] env[68964]: DEBUG oslo_concurrency.lockutils [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2302.399034] env[68964]: INFO nova.compute.manager [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Terminating instance
[ 2302.402078] env[68964]: DEBUG nova.compute.manager [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2302.402270] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2302.402537] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3a84a6ec-3972-4d58-9edc-857a38338d55 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2302.416120] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7eca4941-0625-4573-8490-0aedbe3b8913 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2302.445719] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70 could not be found.
[ 2302.445922] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2302.446120] env[68964]: INFO nova.compute.manager [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2302.446395] env[68964]: DEBUG oslo.service.loopingcall [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2302.446630] env[68964]: DEBUG nova.compute.manager [-] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2302.446730] env[68964]: DEBUG nova.network.neutron [-] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2302.475388] env[68964]: DEBUG nova.network.neutron [-] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2302.483905] env[68964]: INFO nova.compute.manager [-] [instance: eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70] Took 0.04 seconds to deallocate network for instance.
[ 2302.575130] env[68964]: DEBUG oslo_concurrency.lockutils [None req-85645ffe-4296-43de-8350-1b26aff06a14 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "eb1ee02c-ee1e-49d1-b1f7-b6b586c20c70" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2303.725122] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2304.720148] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2304.739956] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2305.724765] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2307.726063] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2308.719636] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2309.725026] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2309.737610] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2309.737790] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2309.737944] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2309.738113] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2309.739251] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-336140fd-0057-4ec3-bad9-fff9e3870134 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2309.747916] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a3892f-da3b-446f-9890-013ec535b030 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2309.761519] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1526182-df82-491f-a89a-9ee2450e17b2 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2309.767726] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0fc7b26-d789-43bc-ba57-500c6433bd8d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2309.800650] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180913MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2309.800806] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2309.800985] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2309.865024] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865144] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865245] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865361] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865480] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865596] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dcd4de94-0433-416d-a9f6-c24f584a80ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865711] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96d01266-87ae-4bb5-a047-c81dd74c0f24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.865824] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8d45312a-5084-40b7-b4f7-733a2285bb4d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2309.866013] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2309.866177] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2309.959563] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5afdbf35-04b5-4678-90f1-c68728cf2631 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2309.967481] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6250cbe-4119-4254-85a5-c77f56a25c19 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2309.996990] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74d8ff8-4257-4b57-93e2-bdac82ddc825 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2310.003418] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ed334b1-4faa-40f3-8857-f9113bd009c9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2310.016205] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2310.024421] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2310.040479] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2310.040659] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2346.995136] env[68964]: WARNING oslo_vmware.rw_handles [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles response.begin()
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2346.995136] env[68964]: ERROR oslo_vmware.rw_handles
[ 2346.995803] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2346.997464] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2346.997709] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Copying Virtual Disk [datastore1] vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/a2963065-d6b1-4f02-846f-abbdbfffa92c/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2346.997983] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-10dd6ac4-1475-4314-9d0f-007708e33faa {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2347.005845] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Waiting for the task: (returnval){
[ 2347.005845] env[68964]: value = "task-3431804"
[ 2347.005845] env[68964]: _type = "Task"
[ 2347.005845] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2347.014137] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Task: {'id': task-3431804, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2347.516097] env[68964]: DEBUG oslo_vmware.exceptions [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2347.516432] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2347.516982] env[68964]: ERROR nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2347.516982] env[68964]: Faults: ['InvalidArgument']
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Traceback (most recent call last):
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] yield resources
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self.driver.spawn(context, instance, image_meta,
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self._fetch_image_if_missing(context, vi)
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] image_cache(vi, tmp_image_ds_loc)
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] vm_util.copy_virtual_disk(
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] session._wait_for_task(vmdk_copy_task)
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] return self.wait_for_task(task_ref)
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] return evt.wait()
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] result = hub.switch()
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] return self.greenlet.switch()
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self.f(*self.args, **self.kw)
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] raise exceptions.translate_fault(task_info.error)
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Faults: ['InvalidArgument']
[ 2347.516982] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283]
[ 2347.517884] env[68964]: INFO nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Terminating instance
[ 2347.518848] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2347.519116] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2347.519297] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a2bc30aa-d41c-4657-a185-03f8387936ad {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2347.521590] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2347.521780] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2347.522519] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04b8a119-7824-493e-a5b4-90cbbdcf659d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2347.529290] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2347.529500] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dd013442-5fab-483f-acf4-00ed477267eb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2347.531727] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2347.531898] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2347.532892] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-18f95feb-0c46-4b80-a9ad-98aeb466ddb9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2347.537565] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){
[ 2347.537565] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]524a2624-eb09-4076-a17f-874b98f2190b"
[ 2347.537565] env[68964]: _type = "Task"
[ 2347.537565] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2347.544545] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]524a2624-eb09-4076-a17f-874b98f2190b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2348.048379] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2348.048707] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating directory with path [datastore1] vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2348.048897] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c86afde0-00ff-4ddc-8a8e-1d9477719717 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2348.069210] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Created directory with path [datastore1] vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2348.069419] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Fetch image to [datastore1] vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2348.069627] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1]
vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2348.070369] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49bb638e-0dd6-4919-8cbd-46e8987ffe34 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.076975] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98d93695-27ec-4e10-813e-eb03df21eb4e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.085907] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8464b167-d581-4b75-b455-095604478cfc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.118070] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e65443-957d-4238-befb-ee893c6316e3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.123817] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1b261038-8cc8-4d71-97a7-fb62cc10b399 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.148868] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2348.195939] env[68964]: DEBUG oslo_vmware.rw_handles [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2348.255160] env[68964]: DEBUG oslo_vmware.rw_handles [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2348.255354] env[68964]: DEBUG oslo_vmware.rw_handles [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2348.578709] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2348.578933] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2348.579133] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Deleting the datastore file [datastore1] a257b05d-fa9a-4d1a-9086-d571e45a5283 {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2348.579389] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d44071e3-220b-411d-94e8-db7050f061d8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.585570] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Waiting for the task: (returnval){ [ 2348.585570] env[68964]: value = "task-3431806" [ 2348.585570] env[68964]: _type = "Task" [ 2348.585570] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2348.593119] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Task: {'id': task-3431806, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2349.095591] env[68964]: DEBUG oslo_vmware.api [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Task: {'id': task-3431806, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07062} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2349.095921] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2349.095921] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2349.096103] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2349.096280] env[68964]: INFO nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Took 1.57 seconds to destroy the instance on the hypervisor. [ 2349.098623] env[68964]: DEBUG nova.compute.claims [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2349.098799] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2349.099014] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.244637] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eb7c9f2-c42c-4385-83a5-03748343e197 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.252057] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68948d18-1ee7-4b2e-925b-9c270a6aaff4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.281812] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-657f100e-884a-4ddf-9aaf-edb43dbff876 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.288713] env[68964]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02c5affb-8829-4046-9019-a2b22d8d826d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.302907] env[68964]: DEBUG nova.compute.provider_tree [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2349.311262] env[68964]: DEBUG nova.scheduler.client.report [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2349.325603] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.326131] env[68964]: ERROR nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2349.326131] env[68964]: Faults: ['InvalidArgument'] [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Traceback (most recent call last): [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self.driver.spawn(context, instance, image_meta, [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self._fetch_image_if_missing(context, vi) [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] image_cache(vi, tmp_image_ds_loc) [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] vm_util.copy_virtual_disk( [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] session._wait_for_task(vmdk_copy_task) [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] return self.wait_for_task(task_ref) [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] return evt.wait() [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] result = hub.switch() [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] return self.greenlet.switch() [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] self.f(*self.args, **self.kw) [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] raise exceptions.translate_fault(task_info.error) [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Faults: ['InvalidArgument'] [ 2349.326131] env[68964]: ERROR nova.compute.manager [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] [ 2349.326962] env[68964]: DEBUG nova.compute.utils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] VimFaultException {{(pid=68964) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 2349.328527] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Build of instance a257b05d-fa9a-4d1a-9086-d571e45a5283 was re-scheduled: A specified parameter was not correct: fileType [ 2349.328527] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2349.328906] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2349.329089] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2349.329260] env[68964]: DEBUG nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2349.329421] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2349.629655] env[68964]: DEBUG nova.network.neutron [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2349.645599] env[68964]: INFO nova.compute.manager [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Took 0.32 seconds to deallocate network for instance. 
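
Every CopyVirtualDisk_Task failure in this log surfaces through the same path: the vCenter task is polled until it leaves the running state, and an error result is converted into an exception carrying the VIM fault names ('InvalidArgument' here). The "Fault InvalidArgument not matched" debug lines show the lookup for a dedicated exception class failing, so the generic VimFaultException is raised instead. A minimal sketch of that poll-and-translate pattern, using illustrative stand-in names rather than the real oslo.vmware signatures:

```python
import time


class VimFaultException(Exception):
    """Generic fallback used when no specific fault class matches."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


# Map of VIM fault names to dedicated exception classes; left empty here, so
# an "InvalidArgument" fault falls through to VimFaultException, mirroring
# the "Fault InvalidArgument not matched" debug lines above.
_FAULT_CLASSES = {}


def translate_fault(error):
    # error is a stand-in dict like {"fault_name": ..., "message": ...}.
    cls = _FAULT_CLASSES.get(error["fault_name"])
    if cls is None:
        return VimFaultException([error["fault_name"]], error["message"])
    return cls(error["message"])


def wait_for_task(poll_task_info, interval=0.5):
    # poll_task_info() stands in for the Task polling that produces the
    # "progress is 0%" lines; it returns dicts such as {"state": "running"}
    # or {"state": "error", "error": {...}}.
    while True:
        info = poll_task_info()
        if info["state"] == "running":
            time.sleep(interval)
            continue
        if info["state"] == "success":
            return info
        raise translate_fault(info["error"])
```
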
[ 2349.734095] env[68964]: INFO nova.scheduler.client.report [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Deleted allocations for instance a257b05d-fa9a-4d1a-9086-d571e45a5283 [ 2349.756057] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a6781777-079d-4ff7-8e27-8eac480087d6 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 649.981s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.756923] env[68964]: DEBUG oslo_concurrency.lockutils [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 453.588s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.756923] env[68964]: DEBUG oslo_concurrency.lockutils [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Acquiring lock "a257b05d-fa9a-4d1a-9086-d571e45a5283-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2349.756923] env[68964]: DEBUG oslo_concurrency.lockutils [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.757152] env[68964]: DEBUG oslo_concurrency.lockutils [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.758980] env[68964]: INFO nova.compute.manager [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Terminating instance [ 2349.761449] env[68964]: DEBUG nova.compute.manager [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Start destroying the instance on the hypervisor. 
{{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2349.761449] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2349.761449] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b7aa10c2-471e-4f4e-86ed-d05a13038f21 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.770341] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-271da4c4-126c-4570-8af7-6736c5c8b56f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.797848] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a257b05d-fa9a-4d1a-9086-d571e45a5283 could not be found. [ 2349.798054] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2349.798243] env[68964]: INFO nova.compute.manager [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2349.798508] env[68964]: DEBUG oslo.service.loopingcall [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2349.798757] env[68964]: DEBUG nova.compute.manager [-] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2349.798829] env[68964]: DEBUG nova.network.neutron [-] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2349.822077] env[68964]: DEBUG nova.network.neutron [-] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2349.829568] env[68964]: INFO nova.compute.manager [-] [instance: a257b05d-fa9a-4d1a-9086-d571e45a5283] Took 0.03 seconds to deallocate network for instance. 
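
The WARNING above shows the destroy path tolerating a VM that is already gone from the backend: the not-found error is logged and terminate proceeds straight to network deallocation instead of failing. A rough sketch of that idempotent-destroy shape, with hypothetical helper names (lookup_vm, unregister_vm, delete_files) standing in for the vmops calls seen in the trace:

```python
import logging

LOG = logging.getLogger(__name__)


class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""


def destroy_instance(instance_uuid, lookup_vm, unregister_vm, delete_files):
    # Look up the backend VM (SearchIndex.FindAllByUuid in the trace); treat
    # "already gone" as success so terminate stays idempotent.
    try:
        vm_ref = lookup_vm(instance_uuid)
    except InstanceNotFound as exc:
        LOG.warning("Instance does not exist on backend: %s", exc)
        return  # nothing to tear down; caller moves on to network cleanup
    unregister_vm(vm_ref)        # VirtualMachine.UnregisterVM
    delete_files(instance_uuid)  # FileManager.DeleteDatastoreFile_Task
```
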
[ 2349.922879] env[68964]: DEBUG oslo_concurrency.lockutils [None req-63a1af0e-9452-42bd-9c79-c8f5604c97d2 tempest-ServerMetadataTestJSON-1851390513 tempest-ServerMetadataTestJSON-1851390513-project-member] Lock "a257b05d-fa9a-4d1a-9086-d571e45a5283" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.166s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2362.041053] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2362.041053] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2362.041053] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2362.060968] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.061102] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.061400] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.061593] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.061727] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.061850] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.061969] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2362.062098] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2362.723952] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.724607] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.724896] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2363.724972] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2364.724850] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2366.724711] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2368.726480] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2369.720941] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2370.724487] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2370.736351] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2370.736579] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 
0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2370.736733] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2370.736903] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2370.738065] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fefdfc33-ed85-4c17-8fab-2bf755141d7b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.747790] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf9fbaef-7664-47c9-9b1a-92850de357e1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.761747] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7611140-da6d-44fb-a394-fbe481689b27 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.767841] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5cab175-cc3a-447b-9821-0c5b648114dc {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.796509] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2370.796661] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2370.796827] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2370.857755] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.857868] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.858013] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.858173] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.858335] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dcd4de94-0433-416d-a9f6-c24f584a80ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.858477] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96d01266-87ae-4bb5-a047-c81dd74c0f24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.858644] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8d45312a-5084-40b7-b4f7-733a2285bb4d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2370.858832] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2370.859022] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2370.947162] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dd5a093-ca77-47af-9ed4-855a9709eaf8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.955598] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8268c776-334e-4635-a838-afeb2d7c63f1 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.988202] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cd1679e-9dde-4b8b-ad00-cc7cd01f755c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2370.994976] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea73eca6-4bdf-4151-9041-40a77667dd6c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2371.008199] env[68964]: DEBUG nova.compute.provider_tree [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2371.016444] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2371.030600] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2371.030786] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2397.998384] env[68964]: WARNING oslo_vmware.rw_handles [None 
req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2397.998384] env[68964]: ERROR oslo_vmware.rw_handles [ 2397.999205] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2398.000887] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2398.001148] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Copying Virtual Disk [datastore1] vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/e00bcb72-c8fd-4e1f-a656-e2510f97f765/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2398.001461] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b2cdedc7-7313-45f0-9888-9cd9824ca1ab {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.009861] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 2398.009861] env[68964]: value = "task-3431807" [ 2398.009861] env[68964]: _type = "Task" [ 2398.009861] env[68964]: } to complete. 
{{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2398.017833] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': task-3431807, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2398.521163] env[68964]: DEBUG oslo_vmware.exceptions [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2398.521446] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2398.522017] env[68964]: ERROR nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2398.522017] env[68964]: Faults: ['InvalidArgument'] [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Traceback (most recent call last): [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] yield resources [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self.driver.spawn(context, instance, image_meta, [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self._fetch_image_if_missing(context, vi) [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] image_cache(vi, tmp_image_ds_loc) [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] vm_util.copy_virtual_disk( [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] session._wait_for_task(vmdk_copy_task) [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] return self.wait_for_task(task_ref) [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] return evt.wait() [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] result = hub.switch() [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] return self.greenlet.switch() [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self.f(*self.args, **self.kw) [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] raise exceptions.translate_fault(task_info.error) [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Faults: ['InvalidArgument'] [ 2398.522017] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] [ 2398.523150] env[68964]: INFO nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Terminating instance [ 2398.523872] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2398.524095] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2398.524354] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-19d4e2b2-89c4-4a25-945f-7aca44e2573a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.526473] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2398.526679] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2398.527406] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca52e1ea-811d-4afa-a18e-d6cf28cfd614 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.534178] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2398.534368] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89361ba8-e4d3-4297-b9c1-d2e885f10965 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.536526] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2398.536718] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2398.537729] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f0636b1-fef4-40bd-ae62-d186248a6eea {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.542471] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 2398.542471] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a51e80-1a80-9abb-7986-9360a942f107" [ 2398.542471] env[68964]: _type = "Task" [ 2398.542471] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2398.553084] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a51e80-1a80-9abb-7986-9360a942f107, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2398.602037] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2398.602254] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2398.602430] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Deleting the datastore file [datastore1] bcd10e27-d4fe-4b00-a55a-1c52ef78fcad {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2398.602685] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-51384724-2831-4abe-beed-a337edad14b9 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.609357] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for the task: (returnval){ [ 2398.609357] env[68964]: value = "task-3431809" [ 2398.609357] env[68964]: _type = "Task" [ 2398.609357] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2398.617007] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': task-3431809, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2399.052602] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2399.052966] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating directory with path [datastore1] vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2399.053041] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e65fc62e-b788-4d05-991e-73338d0dabd4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.064040] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Created directory with path [datastore1] vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2399.064241] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Fetch image to [datastore1] vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2399.064439] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2399.065107] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62678d34-b54c-4a20-b745-ca5cab9fddbd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.073057] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f2fbe6-f41b-4206-bb65-a67f518c1314 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.082248] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b84155f0-ed8d-4905-b999-7ffa9d567345 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.115425] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3386c96a-5879-4976-8869-0bc06823681a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.122652] env[68964]: DEBUG oslo_vmware.api [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Task: {'id': task-3431809, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087001} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2399.124075] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2399.124277] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2399.124554] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2399.124607] env[68964]: INFO nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Took 0.60 seconds to destroy the instance on the hypervisor. 
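The DeleteDatastoreFile_Task records above illustrate oslo.vmware's generic task pattern: a vSphere API method whose name ends in _Task returns a Task managed object immediately, and the caller polls its TaskInfo until the task reports success or error (each "progress is 0%" line is one such poll; the "completed successfully" line carries the measured duration_secs). A minimal sketch of driving the same pattern directly, assuming a hypothetical vCenter endpoint, credentials and datastore path:

    from oslo_vmware import api

    # Hypothetical endpoint and credentials; task_poll_interval=0.5 mirrors
    # the roughly half-second gap between the poll records in this log.
    session = api.VMwareAPISession(
        'vc.example.test', 'devstack', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # Methods ending in _Task return a Task managed object without blocking.
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task',
        session.vim.service_content.fileManager,
        name='[datastore1] bcd10e27-d4fe-4b00-a55a-1c52ef78fcad',
        datacenter=None)  # a real caller passes the Datacenter moref here

    # wait_for_task() re-reads TaskInfo on every poll interval, returns the
    # final info on 'success', and raises a translated exception on 'error'
    # (the VimFaultException with Faults: ['InvalidArgument'] seen earlier
    # surfaced through exactly this path).
    task_info = session.wait_for_task(task)

Nova itself reaches this code through the wrappers visible in the source tags above (ds_util.file_delete() and session._wait_for_task()).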
[ 2399.126417] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b5e5ae66-4558-4c21-94d1-97305abff4ae {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.128339] env[68964]: DEBUG nova.compute.claims [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2399.128510] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2399.128733] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2399.151488] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2399.238857] env[68964]: DEBUG oslo_vmware.rw_handles [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2399.298690] env[68964]: DEBUG oslo_vmware.rw_handles [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2399.298908] env[68964]: DEBUG oslo_vmware.rw_handles [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2399.318085] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c048b8b-963a-42f0-8273-52c6d2986812 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.325346] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eacf43c7-76b2-4fd0-b5b5-1f401dd021a4 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.355585] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-890f4208-ddcd-4f41-8d8f-2337e758a43d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.362304] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70b91215-020f-4f95-9046-78aec8717048 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.374849] env[68964]: DEBUG nova.compute.provider_tree [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2399.383347] env[68964]: DEBUG nova.scheduler.client.report [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2399.398655] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2399.399205] env[68964]: ERROR nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2399.399205] env[68964]: Faults: ['InvalidArgument'] [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Traceback (most recent call last): [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: 
bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self.driver.spawn(context, instance, image_meta, [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self._fetch_image_if_missing(context, vi) [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] image_cache(vi, tmp_image_ds_loc) [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] vm_util.copy_virtual_disk( [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] session._wait_for_task(vmdk_copy_task) [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] return self.wait_for_task(task_ref) [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] return evt.wait() [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] result = hub.switch() [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] return self.greenlet.switch() [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] self.f(*self.args, **self.kw) [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] raise exceptions.translate_fault(task_info.error) [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Faults: ['InvalidArgument'] [ 2399.399205] env[68964]: ERROR nova.compute.manager [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] [ 2399.399969] env[68964]: DEBUG nova.compute.utils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2399.401274] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Build of instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad was re-scheduled: A specified parameter was not correct: fileType [ 2399.401274] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2399.401630] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2399.401800] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2399.401969] env[68964]: DEBUG nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2399.402149] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2399.654157] env[68964]: DEBUG nova.network.neutron [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2399.668359] env[68964]: INFO nova.compute.manager [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Took 0.27 seconds to deallocate network for instance. [ 2399.763424] env[68964]: INFO nova.scheduler.client.report [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Deleted allocations for instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad [ 2399.784664] env[68964]: DEBUG oslo_concurrency.lockutils [None req-55a84f44-95a8-4c57-bd7e-d1cd72375587 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 536.969s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2399.784968] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 340.652s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2399.785208] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2399.785414] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2399.785579] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2399.788486] env[68964]: INFO nova.compute.manager [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Terminating instance [ 2399.790038] env[68964]: DEBUG nova.compute.manager [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2399.790038] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2399.790445] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e45155d3-c344-4034-bd50-7b65f4246f0e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.799522] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78e3b93c-8371-47c8-9774-e5e9c09bf3f8 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2399.826973] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bcd10e27-d4fe-4b00-a55a-1c52ef78fcad could not be found. [ 2399.827194] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2399.827371] env[68964]: INFO nova.compute.manager [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2399.827608] env[68964]: DEBUG oslo.service.loopingcall [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2399.827825] env[68964]: DEBUG nova.compute.manager [-] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2399.827923] env[68964]: DEBUG nova.network.neutron [-] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2399.850056] env[68964]: DEBUG nova.network.neutron [-] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2399.859055] env[68964]: INFO nova.compute.manager [-] [instance: bcd10e27-d4fe-4b00-a55a-1c52ef78fcad] Took 0.03 seconds to deallocate network for instance. [ 2399.943149] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2585929c-7a3e-4bd9-a611-ccd19d60de26 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Lock "bcd10e27-d4fe-4b00-a55a-1c52ef78fcad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.158s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2407.156254] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "7bb8b324-dbb0-4695-96b0-cc06db749fb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2407.156550] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Lock "7bb8b324-dbb0-4695-96b0-cc06db749fb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2407.166865] env[68964]: DEBUG nova.compute.manager [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2407.215107] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2407.215356] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2407.216800] env[68964]: INFO nova.compute.claims [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2407.341316] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74d32bf8-8d23-41c9-9bf5-e167a53a6d3b {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2407.349133] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b55c9f4-3ba6-4dc8-9399-151216c1676a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2407.379242] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a6be01a-8ddb-4ed9-811f-92d9ea4d6308 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2407.386339] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a8fb25e-248e-458d-a4a5-e960671fe4ee {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2407.400185] env[68964]: DEBUG nova.compute.provider_tree [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2407.409423] env[68964]: DEBUG nova.scheduler.client.report [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2407.421766] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2407.422223] env[68964]: DEBUG nova.compute.manager [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2407.453599] env[68964]: DEBUG nova.compute.utils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2407.454732] env[68964]: DEBUG nova.compute.manager [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2407.454900] env[68964]: DEBUG nova.network.neutron [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2407.463526] env[68964]: DEBUG nova.compute.manager [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2407.521738] env[68964]: DEBUG nova.policy [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57b204a484c24d2eaa9a909b7c831bc9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '315c6290ec974ff0b91c8856a6716aa3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2407.525388] env[68964]: DEBUG nova.compute.manager [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Start spawning the instance on the hypervisor. 
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2407.551127] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=<?>,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-15T06:03:16Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2407.551484] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2407.551701] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2407.551967] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2407.552194] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2407.552406] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2407.552688] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2407.552910] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2407.553158] env[68964]: DEBUG nova.virt.hardware [None 
req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2407.553387] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2407.553621] env[68964]: DEBUG nova.virt.hardware [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2407.554830] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0576791d-1b6f-4fbb-87e6-f9bd96ad7cbb {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2407.567910] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0729fb35-5f8e-4424-9690-01e90f1d572d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2407.836524] env[68964]: DEBUG nova.network.neutron [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Successfully created port: b7ec2517-5557-48da-8757-5155224af6e0 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2408.430237] env[68964]: DEBUG nova.compute.manager [req-0d467b71-3c80-4927-a68f-93cf1f386d60 req-2f42fc3a-8062-42f8-a127-f7b6cad7b618 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Received event network-vif-plugged-b7ec2517-5557-48da-8757-5155224af6e0 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2408.430510] env[68964]: DEBUG oslo_concurrency.lockutils [req-0d467b71-3c80-4927-a68f-93cf1f386d60 req-2f42fc3a-8062-42f8-a127-f7b6cad7b618 service nova] Acquiring lock "7bb8b324-dbb0-4695-96b0-cc06db749fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2408.430644] env[68964]: DEBUG oslo_concurrency.lockutils [req-0d467b71-3c80-4927-a68f-93cf1f386d60 req-2f42fc3a-8062-42f8-a127-f7b6cad7b618 service nova] Lock "7bb8b324-dbb0-4695-96b0-cc06db749fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2408.430808] env[68964]: DEBUG oslo_concurrency.lockutils [req-0d467b71-3c80-4927-a68f-93cf1f386d60 req-2f42fc3a-8062-42f8-a127-f7b6cad7b618 service nova] Lock "7bb8b324-dbb0-4695-96b0-cc06db749fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2408.430974] env[68964]: DEBUG nova.compute.manager 
[req-0d467b71-3c80-4927-a68f-93cf1f386d60 req-2f42fc3a-8062-42f8-a127-f7b6cad7b618 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] No waiting events found dispatching network-vif-plugged-b7ec2517-5557-48da-8757-5155224af6e0 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2408.431726] env[68964]: WARNING nova.compute.manager [req-0d467b71-3c80-4927-a68f-93cf1f386d60 req-2f42fc3a-8062-42f8-a127-f7b6cad7b618 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Received unexpected event network-vif-plugged-b7ec2517-5557-48da-8757-5155224af6e0 for instance with vm_state building and task_state spawning. [ 2408.511492] env[68964]: DEBUG nova.network.neutron [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Successfully updated port: b7ec2517-5557-48da-8757-5155224af6e0 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2408.521228] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "refresh_cache-7bb8b324-dbb0-4695-96b0-cc06db749fb1" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2408.521373] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired lock "refresh_cache-7bb8b324-dbb0-4695-96b0-cc06db749fb1" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2408.521523] env[68964]: DEBUG nova.network.neutron [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2408.558995] env[68964]: DEBUG nova.network.neutron [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Instance cache missing network info. 
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2408.734914] env[68964]: DEBUG nova.network.neutron [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Updating instance_info_cache with network_info: [{"id": "b7ec2517-5557-48da-8757-5155224af6e0", "address": "fa:16:3e:51:4f:e9", "network": {"id": "0f42ddf1-c83c-4ac8-bfeb-12d61e91369b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1180077751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "315c6290ec974ff0b91c8856a6716aa3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7ec2517-55", "ovs_interfaceid": "b7ec2517-5557-48da-8757-5155224af6e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2408.748216] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Releasing lock "refresh_cache-7bb8b324-dbb0-4695-96b0-cc06db749fb1" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2408.748523] env[68964]: DEBUG nova.compute.manager [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Instance network_info: |[{"id": "b7ec2517-5557-48da-8757-5155224af6e0", "address": "fa:16:3e:51:4f:e9", "network": {"id": "0f42ddf1-c83c-4ac8-bfeb-12d61e91369b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1180077751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "315c6290ec974ff0b91c8856a6716aa3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7ec2517-55", "ovs_interfaceid": "b7ec2517-5557-48da-8757-5155224af6e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2408.748953] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:51:4f:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5f60c972-a72d-4c5f-a250-faadfd6eafbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b7ec2517-5557-48da-8757-5155224af6e0', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2408.756364] env[68964]: DEBUG oslo.service.loopingcall [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2408.756810] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2408.757060] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-703eae5f-dfe6-4563-a7bb-41b26b5e3de3 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2408.778124] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2408.778124] env[68964]: value = "task-3431810" [ 2408.778124] env[68964]: _type = "Task" [ 2408.778124] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2408.785903] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431810, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2409.288061] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431810, 'name': CreateVM_Task, 'duration_secs': 0.319262} completed successfully. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2409.288061] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2409.288680] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2409.288906] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2409.289216] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2409.289463] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c3c83bd-7f01-4bdb-9804-13da14bb9cd0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2409.293770] env[68964]: DEBUG oslo_vmware.api [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Waiting for the task: (returnval){ [ 2409.293770] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a5db76-f80e-55f5-f280-ceb339442fe8" [ 2409.293770] env[68964]: _type = "Task" [ 2409.293770] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2409.302274] env[68964]: DEBUG oslo_vmware.api [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52a5db76-f80e-55f5-f280-ceb339442fe8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2409.804545] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2409.804891] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2409.804942] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a0c122d4-9da7-469b-b56f-422b0bf712d5 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2410.459597] env[68964]: DEBUG nova.compute.manager [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Received event network-changed-b7ec2517-5557-48da-8757-5155224af6e0 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2410.459801] env[68964]: DEBUG nova.compute.manager [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Refreshing instance network info cache due to event network-changed-b7ec2517-5557-48da-8757-5155224af6e0. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2410.460059] env[68964]: DEBUG oslo_concurrency.lockutils [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] Acquiring lock "refresh_cache-7bb8b324-dbb0-4695-96b0-cc06db749fb1" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2410.460205] env[68964]: DEBUG oslo_concurrency.lockutils [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] Acquired lock "refresh_cache-7bb8b324-dbb0-4695-96b0-cc06db749fb1" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2410.460367] env[68964]: DEBUG nova.network.neutron [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Refreshing network info cache for port b7ec2517-5557-48da-8757-5155224af6e0 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2410.889571] env[68964]: DEBUG nova.network.neutron [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Updated VIF entry in instance network info cache for port b7ec2517-5557-48da-8757-5155224af6e0. 
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2410.889950] env[68964]: DEBUG nova.network.neutron [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Updating instance_info_cache with network_info: [{"id": "b7ec2517-5557-48da-8757-5155224af6e0", "address": "fa:16:3e:51:4f:e9", "network": {"id": "0f42ddf1-c83c-4ac8-bfeb-12d61e91369b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1180077751-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "315c6290ec974ff0b91c8856a6716aa3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5f60c972-a72d-4c5f-a250-faadfd6eafbe", "external-id": "nsx-vlan-transportzone-932", "segmentation_id": 932, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7ec2517-55", "ovs_interfaceid": "b7ec2517-5557-48da-8757-5155224af6e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2410.898457] env[68964]: DEBUG oslo_concurrency.lockutils [req-f53fd7fa-e87b-4523-a65b-7ea5cdce1298 req-fd2b67e3-9b4c-4431-850a-ccb73449ac84 service nova] Releasing lock "refresh_cache-7bb8b324-dbb0-4695-96b0-cc06db749fb1" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2419.724787] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2421.734056] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2421.734056] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2421.734056] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2421.753393] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.753510] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.753629] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.753753] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.753876] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.753999] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.754135] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2421.754262] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2422.724964] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2423.724155] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2423.724442] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2423.724551] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2425.721742] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2425.739046] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2428.724057] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2428.724425] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2431.720053] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2431.724698] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2431.724852] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances with incomplete migration {{(pid=68964) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2432.733629] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager.update_available_resource {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2432.745581] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2432.745795] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2432.745972] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2432.746165] env[68964]: DEBUG nova.compute.resource_tracker 
[None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68964) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2432.747340] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947b8b5a-f532-4eee-b4e0-384961aa1b5d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2432.756037] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fad57250-2f18-4a79-bb2b-173de25b509c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2432.770580] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df412d07-dc91-441e-8d23-b64292c3b250 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2432.776707] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a53dfc49-8ce1-49f2-b564-85d981f368af {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2432.805043] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=98GB free_vcpus=48 pci_devices=None {{(pid=68964) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2432.805164] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2432.805339] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2432.907160] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance cc90a5a6-19e6-4674-ad06-2c840927409d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.907324] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance be9830e6-1e07-443b-b08e-cefac29e2e5c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.907455] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.907579] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance dcd4de94-0433-416d-a9f6-c24f584a80ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.907703] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 96d01266-87ae-4bb5-a047-c81dd74c0f24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.907821] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 8d45312a-5084-40b7-b4f7-733a2285bb4d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.907937] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Instance 7bb8b324-dbb0-4695-96b0-cc06db749fb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68964) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2432.908143] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2432.908324] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68964) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2432.992452] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c44305d7-49e8-4ad7-8f73-6c33d0655235 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2433.000218] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b8d9dfa-1ca3-4f0b-bab0-9395702eaa79 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2433.030474] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ceb61a6-6de2-43f0-9f4e-04bd91508545 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2433.037039] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d54f188-37c9-4c33-b4b5-d140ecdd3074 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2433.049617] env[68964]: DEBUG nova.compute.provider_tree 
[None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2433.057949] env[68964]: DEBUG nova.scheduler.client.report [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2433.070274] env[68964]: DEBUG nova.compute.resource_tracker [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68964) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2433.070440] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2445.724605] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2445.724939] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Cleaning up deleted instances {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2445.735387] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] There are 0 instances to clean {{(pid=68964) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2446.814730] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2446.815410] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Getting list of instances from cluster (obj){ [ 2446.815410] env[68964]: value = "domain-c8" [ 2446.815410] env[68964]: _type = "ClusterComputeResource" [ 2446.815410] env[68964]: } {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2446.816599] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3649b0-9b41-4087-9dad-1049912efb46 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.834080] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Got total of 7 instances {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2448.517597] env[68964]: WARNING oslo_vmware.rw_handles [None 
req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles response.begin() [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2448.517597] env[68964]: ERROR oslo_vmware.rw_handles [ 2448.518292] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Downloaded image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2448.520684] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Caching image {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2448.520947] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Copying Virtual Disk [datastore1] vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk to [datastore1] vmware_temp/f877f008-db55-43c3-86ef-1278bc28f530/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk {{(pid=68964) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2448.521244] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5e80e597-2898-4635-b717-6c65310a034e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.529380] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 2448.529380] env[68964]: value = "task-3431811" [ 2448.529380] env[68964]: _type = "Task" [ 2448.529380] 
env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2448.537926] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431811, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2449.040169] env[68964]: DEBUG oslo_vmware.exceptions [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Fault InvalidArgument not matched. {{(pid=68964) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2449.040465] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2449.041035] env[68964]: ERROR nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2449.041035] env[68964]: Faults: ['InvalidArgument'] [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Traceback (most recent call last): [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] yield resources [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self.driver.spawn(context, instance, image_meta, [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self._fetch_image_if_missing(context, vi) [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] image_cache(vi, tmp_image_ds_loc) [ 2449.041035] env[68964]: ERROR 
nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] vm_util.copy_virtual_disk( [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] session._wait_for_task(vmdk_copy_task) [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] return self.wait_for_task(task_ref) [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] return evt.wait() [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] result = hub.switch() [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] return self.greenlet.switch() [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self.f(*self.args, **self.kw) [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] raise exceptions.translate_fault(task_info.error) [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Faults: ['InvalidArgument'] [ 2449.041035] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] [ 2449.042061] env[68964]: INFO nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Terminating instance [ 2449.042885] env[68964]: DEBUG oslo_concurrency.lockutils [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Acquired lock 
"[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2449.043111] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2449.043355] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-815396d3-ceb7-4d9b-90ee-700bf82f0196 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.045554] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2449.045742] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2449.046475] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76ce9d9c-37dd-412b-bc5f-0bf5abf5fa71 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.053324] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Unregistering the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2449.053532] env[68964]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-918273b2-3d32-4e04-b97c-61fbfc1ad8bf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.055674] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2449.055844] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68964) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2449.056782] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a294ebe2-96e3-4f54-b13d-e0751e5ecfda {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.061317] env[68964]: DEBUG oslo_vmware.api [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Waiting for the task: (returnval){ [ 2449.061317] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52135c32-800b-64bc-7134-80a338f62234" [ 2449.061317] env[68964]: _type = "Task" [ 2449.061317] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2449.068424] env[68964]: DEBUG oslo_vmware.api [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52135c32-800b-64bc-7134-80a338f62234, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2449.118176] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Unregistered the VM {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2449.118424] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Deleting contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2449.118606] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleting the datastore file [datastore1] cc90a5a6-19e6-4674-ad06-2c840927409d {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2449.118863] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4fc93e0b-501b-4337-8de5-ac01950c7fd7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.125085] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for the task: (returnval){ [ 2449.125085] env[68964]: value = "task-3431813" [ 2449.125085] env[68964]: _type = "Task" [ 2449.125085] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2449.133454] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431813, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2449.571112] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Preparing fetch location {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2449.571427] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Creating directory with path [datastore1] vmware_temp/98f06cdf-6f93-4f8c-9a98-22f54bbfe97a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2449.571622] env[68964]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f8ae7db-2a74-4fce-a6ec-be0fd013cae0 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.588233] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Created directory with path [datastore1] vmware_temp/98f06cdf-6f93-4f8c-9a98-22f54bbfe97a/b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2449.588450] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Fetch image to [datastore1] vmware_temp/98f06cdf-6f93-4f8c-9a98-22f54bbfe97a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2449.588622] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to [datastore1] vmware_temp/98f06cdf-6f93-4f8c-9a98-22f54bbfe97a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68964) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2449.589456] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a56e2881-3d3f-4395-9748-63fd7d95c19c {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.596403] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66ef26c8-1f20-40a3-81b6-17c26a25bf6e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.606544] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27cd11d7-65c7-4eba-aba5-d79f3db6b70a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.642642] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ac90849-2566-467b-a828-cf77de601cd3 {{(pid=68964) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.650256] env[68964]: DEBUG oslo_vmware.api [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Task: {'id': task-3431813, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07853} completed successfully. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2449.651688] env[68964]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleted the datastore file {{(pid=68964) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2449.651874] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Deleted contents of the VM from datastore datastore1 {{(pid=68964) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2449.652056] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2449.652305] env[68964]: INFO nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Took 0.61 seconds to destroy the instance on the hypervisor. 
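The records above show the invoke-and-wait pattern the driver uses for every long-running vSphere operation in this section: a task (CopyVirtualDisk_Task, DeleteDatastoreFile_Task) is submitted through oslo.vmware, polled until completion (the repeated "progress is 0%" entries), and a task that ends in an error state is raised as a translated fault, which is how the VimFaultException with Faults: ['InvalidArgument'] surfaces out of _poll_task. A minimal sketch of that pattern against the public oslo.vmware API, with placeholder endpoint, credentials, and datastore paths rather than Nova's actual configuration:

    from oslo_vmware import api, exceptions

    # Endpoint and credentials are placeholders; Nova reads these from its
    # [vmware] config section.
    session = api.VMwareAPISession('vc1.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)

    # Submit the disk copy as a vSphere task (sourceName/destName illustrative).
    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', disk_mgr,
                              sourceName='[datastore1] vmware_temp/img/tmp-sparse.vmdk',
                              destName='[datastore1] vmware_temp/img/img.vmdk')
    try:
        # wait_for_task() polls the task roughly every task_poll_interval
        # seconds (the "progress is 0%" records) and raises the translated
        # fault once the task reports an error state.
        session.wait_for_task(task)
    except exceptions.VimFaultException as exc:
        print(exc.fault_list)  # e.g. ['InvalidArgument']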
[ 2449.654076] env[68964]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9f85c27c-ad33-48b2-9d67-374463d080cf {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2449.655964] env[68964]: DEBUG nova.compute.claims [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Aborting claim: {{(pid=68964) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2449.656147] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2449.656367] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2449.676152] env[68964]: DEBUG nova.virt.vmwareapi.images [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Downloading image file data b0d1c28b-5c3d-4c47-808f-66751157cde6 to the data store datastore1 {{(pid=68964) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2449.969537] env[68964]: DEBUG oslo_vmware.rw_handles [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98f06cdf-6f93-4f8c-9a98-22f54bbfe97a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68964) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2450.033181] env[68964]: DEBUG oslo_vmware.rw_handles [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Completed reading data from the image iterator. {{(pid=68964) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2450.033382] env[68964]: DEBUG oslo_vmware.rw_handles [None req-77f1718c-b1e3-4a04-a905-185e4f619247 tempest-AttachVolumeTestJSON-635176172 tempest-AttachVolumeTestJSON-635176172-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/98f06cdf-6f93-4f8c-9a98-22f54bbfe97a/b0d1c28b-5c3d-4c47-808f-66751157cde6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68964) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2450.132683] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2efaa5b-db65-4ea3-bbc3-a46dc741534f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2450.140356] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e145fd5-fb59-4113-adc2-a5fcb8d68ffd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2450.172059] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79bb7b1b-65e5-4c0f-b3c0-c1698a9673bd {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2450.178788] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f7b2372-5b38-4e76-8aeb-d8299e6d4fef {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2450.191670] env[68964]: DEBUG nova.compute.provider_tree [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2450.199923] env[68964]: DEBUG nova.scheduler.client.report [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2450.212559] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.556s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2450.213070] env[68964]: ERROR nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2450.213070] env[68964]: Faults: ['InvalidArgument']
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Traceback (most recent call last):
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self.driver.spawn(context, instance, image_meta,
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self._fetch_image_if_missing(context, vi)
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] image_cache(vi, tmp_image_ds_loc)
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] vm_util.copy_virtual_disk(
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] session._wait_for_task(vmdk_copy_task)
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] return self.wait_for_task(task_ref)
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] return evt.wait()
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] result = hub.switch()
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] return self.greenlet.switch()
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] self.f(*self.args, **self.kw)
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] raise exceptions.translate_fault(task_info.error)
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Faults: ['InvalidArgument']
[ 2450.213070] env[68964]: ERROR nova.compute.manager [instance: cc90a5a6-19e6-4674-ad06-2c840927409d]
[ 2450.213797] env[68964]: DEBUG nova.compute.utils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] VimFaultException {{(pid=68964) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2450.215082] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Build of instance cc90a5a6-19e6-4674-ad06-2c840927409d was re-scheduled: A specified parameter was not correct: fileType
[ 2450.215082] env[68964]: Faults: ['InvalidArgument'] {{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2450.215457] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Unplugging VIFs for instance {{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2450.215624] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=68964) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2450.215790] env[68964]: DEBUG nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2450.215949] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2450.535087] env[68964]: DEBUG nova.network.neutron [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2450.552547] env[68964]: INFO nova.compute.manager [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Took 0.33 seconds to deallocate network for instance. [ 2450.647513] env[68964]: INFO nova.scheduler.client.report [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Deleted allocations for instance cc90a5a6-19e6-4674-ad06-2c840927409d [ 2450.670750] env[68964]: DEBUG oslo_concurrency.lockutils [None req-2d3fbf9c-7737-40e6-af00-4b5b8bffb4b7 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 579.110s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2450.671008] env[68964]: DEBUG oslo_concurrency.lockutils [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 382.591s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2450.671251] env[68964]: DEBUG oslo_concurrency.lockutils [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Acquiring lock "cc90a5a6-19e6-4674-ad06-2c840927409d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2450.671460] env[68964]: DEBUG oslo_concurrency.lockutils [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2450.671624] env[68964]: DEBUG oslo_concurrency.lockutils [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2450.673588] env[68964]: INFO nova.compute.manager [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Terminating instance [ 2450.675325] env[68964]: DEBUG nova.compute.manager [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Start destroying the instance on the hypervisor. {{(pid=68964) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2450.675519] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Destroying instance {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2450.675990] env[68964]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e59b952d-64a1-477a-8d7a-bac0a22a497a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2450.685177] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abb8bcef-6b31-4ab5-a012-a63dac29c82a {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2450.711565] env[68964]: WARNING nova.virt.vmwareapi.vmops [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cc90a5a6-19e6-4674-ad06-2c840927409d could not be found. [ 2450.711769] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Instance destroyed {{(pid=68964) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2450.711947] env[68964]: INFO nova.compute.manager [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2450.712213] env[68964]: DEBUG oslo.service.loopingcall [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2450.712664] env[68964]: DEBUG nova.compute.manager [-] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Deallocating network for instance {{(pid=68964) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2450.712765] env[68964]: DEBUG nova.network.neutron [-] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] deallocate_for_instance() {{(pid=68964) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2450.735595] env[68964]: DEBUG nova.network.neutron [-] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Updating instance_info_cache with network_info: [] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2450.743662] env[68964]: INFO nova.compute.manager [-] [instance: cc90a5a6-19e6-4674-ad06-2c840927409d] Took 0.03 seconds to deallocate network for instance. [ 2450.823962] env[68964]: DEBUG oslo_concurrency.lockutils [None req-236d0f2b-5a05-4433-82d9-5c6d1530e7c9 tempest-ServerDiskConfigTestJSON-2069407207 tempest-ServerDiskConfigTestJSON-2069407207-project-member] Lock "cc90a5a6-19e6-4674-ad06-2c840927409d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.153s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2451.366948] env[68964]: DEBUG oslo_concurrency.lockutils [None req-0ffb2735-5cf2-4cf4-99b8-5f23f563aeb5 tempest-ServersTestJSON-1379046078 tempest-ServersTestJSON-1379046078-project-member] Acquiring lock "96d01266-87ae-4bb5-a047-c81dd74c0f24" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2466.198539] env[68964]: DEBUG oslo_concurrency.lockutils [None req-b0016c3e-372e-4266-9bec-91a21847fb53 tempest-ServerMetadataNegativeTestJSON-1864369537 tempest-ServerMetadataNegativeTestJSON-1864369537-project-member] Acquiring lock "8d45312a-5084-40b7-b4f7-733a2285bb4d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.804604] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._sync_power_states {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2471.821157] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Getting list of instances from cluster (obj){ [ 2471.821157] env[68964]: value = "domain-c8" [ 2471.821157] env[68964]: _type = "ClusterComputeResource" [ 2471.821157] env[68964]: } {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2471.821871] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34e781ee-b340-4b3a-9dd9-a964bec572b8 {{(pid=68964) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2471.835878] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Got total of 6 instances {{(pid=68964) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2471.836072] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid be9830e6-1e07-443b-b08e-cefac29e2e5c {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2471.836266] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid f4fdc36a-1a04-46ac-84ad-a6a05ae64e61 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2471.836426] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid dcd4de94-0433-416d-a9f6-c24f584a80ad {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2471.836577] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 96d01266-87ae-4bb5-a047-c81dd74c0f24 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2471.836725] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 8d45312a-5084-40b7-b4f7-733a2285bb4d {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2471.836871] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Triggering sync for uuid 7bb8b324-dbb0-4695-96b0-cc06db749fb1 {{(pid=68964) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2471.837197] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "be9830e6-1e07-443b-b08e-cefac29e2e5c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.837427] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "f4fdc36a-1a04-46ac-84ad-a6a05ae64e61" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.837629] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "dcd4de94-0433-416d-a9f6-c24f584a80ad" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.837825] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "96d01266-87ae-4bb5-a047-c81dd74c0f24" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.838027] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "8d45312a-5084-40b7-b4f7-733a2285bb4d" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2471.838229] env[68964]: DEBUG oslo_concurrency.lockutils [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Acquiring lock "7bb8b324-dbb0-4695-96b0-cc06db749fb1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2475.672502] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2475.672810] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2475.682237] env[68964]: DEBUG nova.compute.manager [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Starting instance... 
{{(pid=68964) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2475.728332] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2475.728601] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2475.730027] env[68964]: INFO nova.compute.claims [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2475.864008] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dc264aa-8207-4546-a0ae-9fdf3990c43d {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.871497] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acfc5150-13b7-417d-aaf4-3a36947b4b22 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.900628] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72ce82ac-c6f6-4986-8f72-6fb7ef62d3e7 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.907316] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cc46a3c-0890-48f1-854a-922cf6698a9f {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2475.920148] env[68964]: DEBUG nova.compute.provider_tree [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed in ProviderTree for provider: 63b0294e-f555-48a6-a542-3466427066a9 {{(pid=68964) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2475.928406] env[68964]: DEBUG nova.scheduler.client.report [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Inventory has not changed for provider 63b0294e-f555-48a6-a542-3466427066a9 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 98, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68964) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2475.942189] env[68964]: DEBUG oslo_concurrency.lockutils [None 
req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2475.942659] env[68964]: DEBUG nova.compute.manager [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Start building networks asynchronously for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2475.972830] env[68964]: DEBUG nova.compute.utils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Using /dev/sd instead of None {{(pid=68964) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2475.973975] env[68964]: DEBUG nova.compute.manager [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Allocating IP information in the background. {{(pid=68964) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2475.974158] env[68964]: DEBUG nova.network.neutron [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] allocate_for_instance() {{(pid=68964) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2475.982955] env[68964]: DEBUG nova.compute.manager [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Start building block device mappings for instance. {{(pid=68964) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2476.027749] env[68964]: DEBUG nova.policy [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21b5b62c1d9a4afc8e26b122ce6de51c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '07b4913b8fef4ee3a0d920bc36fefd18', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68964) authorize /opt/stack/nova/nova/policy.py:203}} [ 2476.042509] env[68964]: DEBUG nova.compute.manager [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Start spawning the instance on the hypervisor. 
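The claim above succeeds because placement still has room on provider 63b0294e-f555-48a6-a542-3466427066a9. Usable capacity per resource class is (total - reserved) * allocation_ratio, with max_unit capping any single allocation; worked out from the inventory data reported in the log:

# The inventory dict logged above, reduced to the fields the capacity
# arithmetic needs.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'max_unit': 16,
                  'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530,
                  'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'max_unit': 98,
                  'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, usable, 'per-allocation cap:', inv['max_unit'])
# VCPU 192.0 per-allocation cap: 16        (48 physical * 4.0 overcommit)
# MEMORY_MB 196078.0 per-allocation cap: 65530
# DISK_GB 400.0 per-allocation cap: 98

The m1.nano flavor being claimed (1 vCPU, 128 MB, 1 GB root disk) fits comfortably, so the claim is granted and the "compute_resources" lock is released after 0.213s.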
{{(pid=68964) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2476.066521] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-15T06:03:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-15T06:03:15Z,direct_url=,disk_format='vmdk',id=b0d1c28b-5c3d-4c47-808f-66751157cde6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='5759c8ac0b114e32b09097edb04a3e9b',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-15T06:03:16Z,virtual_size=,visibility=), allow threads: False {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2476.066758] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2476.066925] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image limits 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2476.067117] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Flavor pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2476.067264] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Image pref 0:0:0 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2476.067418] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68964) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2476.067626] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2476.067788] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2476.067939] env[68964]: DEBUG nova.virt.hardware [None 
req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Got 1 possible topologies {{(pid=68964) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2476.068111] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2476.068282] env[68964]: DEBUG nova.virt.hardware [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68964) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2476.069126] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fda73afc-8714-430a-b723-fd8244c0e498 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.076936] env[68964]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf00af34-aafc-4c54-a40f-f378482ad568 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2476.350778] env[68964]: DEBUG nova.network.neutron [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Successfully created port: 53132281-6c7d-4cf8-823f-e54bce164e98 {{(pid=68964) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2476.910909] env[68964]: DEBUG nova.compute.manager [req-949f50ef-fe57-4c8e-a578-b490e20b1e14 req-d7b15477-1fbe-43aa-bfa3-b2b7b6992abd service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Received event network-vif-plugged-53132281-6c7d-4cf8-823f-e54bce164e98 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2476.911165] env[68964]: DEBUG oslo_concurrency.lockutils [req-949f50ef-fe57-4c8e-a578-b490e20b1e14 req-d7b15477-1fbe-43aa-bfa3-b2b7b6992abd service nova] Acquiring lock "cba9e33b-752e-4a5f-87cd-d9ca96a91ebe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2476.911351] env[68964]: DEBUG oslo_concurrency.lockutils [req-949f50ef-fe57-4c8e-a578-b490e20b1e14 req-d7b15477-1fbe-43aa-bfa3-b2b7b6992abd service nova] Lock "cba9e33b-752e-4a5f-87cd-d9ca96a91ebe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2476.911514] env[68964]: DEBUG oslo_concurrency.lockutils [req-949f50ef-fe57-4c8e-a578-b490e20b1e14 req-d7b15477-1fbe-43aa-bfa3-b2b7b6992abd service nova] Lock "cba9e33b-752e-4a5f-87cd-d9ca96a91ebe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68964) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2476.911676] env[68964]: DEBUG nova.compute.manager 
[req-949f50ef-fe57-4c8e-a578-b490e20b1e14 req-d7b15477-1fbe-43aa-bfa3-b2b7b6992abd service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] No waiting events found dispatching network-vif-plugged-53132281-6c7d-4cf8-823f-e54bce164e98 {{(pid=68964) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2476.911833] env[68964]: WARNING nova.compute.manager [req-949f50ef-fe57-4c8e-a578-b490e20b1e14 req-d7b15477-1fbe-43aa-bfa3-b2b7b6992abd service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Received unexpected event network-vif-plugged-53132281-6c7d-4cf8-823f-e54bce164e98 for instance with vm_state building and task_state spawning. [ 2476.990826] env[68964]: DEBUG nova.network.neutron [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Successfully updated port: 53132281-6c7d-4cf8-823f-e54bce164e98 {{(pid=68964) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2477.002088] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "refresh_cache-cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2477.003692] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "refresh_cache-cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2477.003692] env[68964]: DEBUG nova.network.neutron [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Building network info cache for instance {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2477.041551] env[68964]: DEBUG nova.network.neutron [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Instance cache missing network info. 
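Back at the nova.virt.hardware records above: with neither flavor extra_specs nor image properties constraining the topology (limits and preferences all 0:0:0), the search space for the one-vCPU m1.nano flavor collapses to a single candidate, hence "Got 1 possible topologies". A rough sketch of that enumeration; the real logic lives in nova/virt/hardware.py, this is only the shape of it:

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product is vcpus."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    yield (s, c, t)

print(list(possible_topologies(1)))   # [(1, 1, 1)], matching the log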
{{(pid=68964) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2477.194320] env[68964]: DEBUG nova.network.neutron [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Updating instance_info_cache with network_info: [{"id": "53132281-6c7d-4cf8-823f-e54bce164e98", "address": "fa:16:3e:d6:31:1a", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53132281-6c", "ovs_interfaceid": "53132281-6c7d-4cf8-823f-e54bce164e98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2477.204841] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "refresh_cache-cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2477.205138] env[68964]: DEBUG nova.compute.manager [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Instance network_info: |[{"id": "53132281-6c7d-4cf8-823f-e54bce164e98", "address": "fa:16:3e:d6:31:1a", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53132281-6c", "ovs_interfaceid": "53132281-6c7d-4cf8-823f-e54bce164e98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68964) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}}
[ 2477.205485] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d6:31:1a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92fe29b3-0907-453d-aabb-5559c4bd7c0f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '53132281-6c7d-4cf8-823f-e54bce164e98', 'vif_model': 'vmxnet3'}] {{(pid=68964) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2477.212920] env[68964]: DEBUG oslo.service.loopingcall [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68964) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2477.213354] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Creating VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2477.213576] env[68964]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c28f10dd-d163-4938-8ea1-6e8ad46e8b4e {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2477.233638] env[68964]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2477.233638] env[68964]: value = "task-3431814"
[ 2477.233638] env[68964]: _type = "Task"
[ 2477.233638] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2477.244250] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431814, 'name': CreateVM_Task} progress is 0%. {{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2477.744267] env[68964]: DEBUG oslo_vmware.api [-] Task: {'id': task-3431814, 'name': CreateVM_Task, 'duration_secs': 0.287375} completed successfully.
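The "Instance VIF info" record above shows the translation from the neutron port in network_info to the VMware VIF spec: an OpaqueNetwork backing keyed by the NSX logical-switch id, with the port UUID carried as iface_id. A hypothetical helper reproducing the mapping for the values in this log; field names follow the logged dicts, not necessarily nova's exact code:

def vif_info_from_port(vif):
    """Map one network_info entry to the VMware VIF spec seen in the log."""
    return {
        'network_name': vif['network']['bridge'],              # 'br-int'
        'mac_address': vif['address'],                         # 'fa:16:3e:d6:31:1a'
        'network_ref': {
            'type': 'OpaqueNetwork',
            'network-id': vif['details']['nsx-logical-switch-id'],
            'network-type': 'nsx.LogicalSwitch',
            'use-external-id': True,
        },
        'iface_id': vif['id'],                                 # neutron port uuid
        'vif_model': 'vmxnet3',
    }

port = {
    'id': '53132281-6c7d-4cf8-823f-e54bce164e98',
    'address': 'fa:16:3e:d6:31:1a',
    'network': {'bridge': 'br-int'},
    'details': {'nsx-logical-switch-id': '92fe29b3-0907-453d-aabb-5559c4bd7c0f'},
}
print(vif_info_from_port(port))

CreateVM_Task is then issued with that spec and polled to completion (0.287s here), the same wait_for_task machinery that raised the fault in the earlier failed build.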
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2477.744429] env[68964]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Created VM on the ESX host {{(pid=68964) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2477.745100] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2477.745269] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2477.745583] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2477.745831] env[68964]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43a879ec-640c-41d9-b0e6-4dfd0ba76e32 {{(pid=68964) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2477.750231] env[68964]: DEBUG oslo_vmware.api [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Waiting for the task: (returnval){ [ 2477.750231] env[68964]: value = "session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d6780d-d708-a6e9-a598-01fe792eca15" [ 2477.750231] env[68964]: _type = "Task" [ 2477.750231] env[68964]: } to complete. {{(pid=68964) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2477.757753] env[68964]: DEBUG oslo_vmware.api [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Task: {'id': session[52c3dc42-4111-4006-e4cf-4dc9273e9e44]52d6780d-d708-a6e9-a598-01fe792eca15, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68964) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2478.261237] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Releasing lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2478.261594] env[68964]: DEBUG nova.virt.vmwareapi.vmops [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Processing image b0d1c28b-5c3d-4c47-808f-66751157cde6 {{(pid=68964) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2478.261665] env[68964]: DEBUG oslo_concurrency.lockutils [None req-a85509ce-f85d-4416-8092-7e2e697a2057 tempest-DeleteServersTestJSON-337123157 tempest-DeleteServersTestJSON-337123157-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/b0d1c28b-5c3d-4c47-808f-66751157cde6/b0d1c28b-5c3d-4c47-808f-66751157cde6.vmdk" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2478.943724] env[68964]: DEBUG nova.compute.manager [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Received event network-changed-53132281-6c7d-4cf8-823f-e54bce164e98 {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2478.943926] env[68964]: DEBUG nova.compute.manager [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Refreshing instance network info cache due to event network-changed-53132281-6c7d-4cf8-823f-e54bce164e98. {{(pid=68964) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2478.944153] env[68964]: DEBUG oslo_concurrency.lockutils [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] Acquiring lock "refresh_cache-cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2478.944296] env[68964]: DEBUG oslo_concurrency.lockutils [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] Acquired lock "refresh_cache-cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2478.944464] env[68964]: DEBUG nova.network.neutron [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Refreshing network info cache for port 53132281-6c7d-4cf8-823f-e54bce164e98 {{(pid=68964) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2479.213622] env[68964]: DEBUG nova.network.neutron [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Updated VIF entry in instance network info cache for port 53132281-6c7d-4cf8-823f-e54bce164e98. 
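The lock names in this stretch are derived from the datastore cache path ("[datastore1] devstack-image-cache_base/<image id>" and the nested .vmdk path), so concurrent spawns of the same image serialize on the cache entry and only the first one pays for the download and disk copy; later spawns find the cached VMDK via SearchDatastore_Task. A simplified sketch of that check-then-fetch-under-lock pattern; the lock table and fetch callable are illustrative:

import threading
from collections import defaultdict

_cache_locks = defaultdict(threading.Lock)

def fetch_image_if_missing(image_id, cached, fetch, datastore='datastore1'):
    # Lock name mirrors the log: '[datastore1] devstack-image-cache_base/<id>'
    key = '[%s] devstack-image-cache_base/%s' % (datastore, image_id)
    with _cache_locks[key]:             # matches the Acquiring/Released pairs
        if image_id not in cached:      # SearchDatastore_Task in the log
            fetch(image_id)             # only the first spawn fetches
            cached.add(image_id)

cached = set()
fetch_image_if_missing('b0d1c28b-5c3d-4c47-808f-66751157cde6', cached,
                       lambda i: print('fetching', i))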
{{(pid=68964) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2479.213983] env[68964]: DEBUG nova.network.neutron [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Updating instance_info_cache with network_info: [{"id": "53132281-6c7d-4cf8-823f-e54bce164e98", "address": "fa:16:3e:d6:31:1a", "network": {"id": "1328a6fa-a33d-40ad-a988-8c9bb0465df4", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1378996195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "07b4913b8fef4ee3a0d920bc36fefd18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92fe29b3-0907-453d-aabb-5559c4bd7c0f", "external-id": "nsx-vlan-transportzone-482", "segmentation_id": 482, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53132281-6c", "ovs_interfaceid": "53132281-6c7d-4cf8-823f-e54bce164e98", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68964) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2479.222950] env[68964]: DEBUG oslo_concurrency.lockutils [req-842e8152-3814-4b0b-aad6-3b41c734eb67 req-30ce1191-ee6d-4c54-b751-8228fcb722b7 service nova] Releasing lock "refresh_cache-cba9e33b-752e-4a5f-87cd-d9ca96a91ebe" {{(pid=68964) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2481.758313] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2481.758313] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Starting heal instance info cache {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2481.758626] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Rebuilding the list of instances to heal {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2481.776090] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: be9830e6-1e07-443b-b08e-cefac29e2e5c] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776271] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: f4fdc36a-1a04-46ac-84ad-a6a05ae64e61] Skipping network cache update for instance because it is Building. 
{{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776380] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: dcd4de94-0433-416d-a9f6-c24f584a80ad] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776505] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 96d01266-87ae-4bb5-a047-c81dd74c0f24] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776625] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 8d45312a-5084-40b7-b4f7-733a2285bb4d] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776744] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: 7bb8b324-dbb0-4695-96b0-cc06db749fb1] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776862] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] [instance: cba9e33b-752e-4a5f-87cd-d9ca96a91ebe] Skipping network cache update for instance because it is Building. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2481.776978] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Didn't find any instances for network info cache update. {{(pid=68964) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2483.724590] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2483.725594] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2483.725967] env[68964]: DEBUG nova.compute.manager [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68964) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2485.726541] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2487.724752] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2489.725777] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2489.726832] env[68964]: DEBUG oslo_service.periodic_task [None req-be866ec5-5117-4591-9afa-45193ee995ee None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68964) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
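The tail of this section is the periodic-task dispatcher ticking over: each ComputeManager method registered as a periodic task runs when its spacing elapses, and some, like _reclaim_queued_deletes with CONF.reclaim_instance_interval <= 0, short-circuit immediately. A minimal stand-in for that dispatch loop; oslo.service's periodic_task machinery is more elaborate, this only mirrors the shape:

import time

_periodic_tasks = []

def periodic_task(spacing):
    def decorator(fn):
        _periodic_tasks.append([fn, spacing, 0.0])  # fn, spacing, last run
        return fn
    return decorator

@periodic_task(spacing=60)
def _reclaim_queued_deletes(reclaim_instance_interval=0):
    if reclaim_instance_interval <= 0:
        print("CONF.reclaim_instance_interval <= 0, skipping...")

def run_periodic_tasks():
    now = time.monotonic()
    for entry in _periodic_tasks:
        fn, spacing, last_run = entry
        if now - last_run >= spacing:
            entry[2] = now
            fn()

run_periodic_tasks()   # emits the same "skipping..." message as the log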